1.1 Fear of the double

A deeply held suspicion towards images and symbols has left its mark on both science and art. Literary history abounds with faulty replicas and despicable twins, from the watery reflection of Narcissus to Dr Jekyll’s murderous alter ego and the seductive usurper in Dostoevsky’s The Double. They appear as living specters: likenesses that unearth painful truths for their “real-life” counterparts, leading them to madness, solipsism and death.

Yet a splintered self is not always a hindrance. In the Tang Dynasty folk tale, “The Divided Daughter,” our protagonist defies her parents’ wishes by running off to live in the imperial capital with her lover. Five years later, when feelings of guilt drive her home again, the young woman’s father explains that his daughter has been bedridden during the same period and never left the house. The prodigal daughter enters her former bedroom and the two women become one – grateful for everything an indivisible self could never have experienced.⁠1

Such misgivings in western literature may well be a product of classical western philosophy, with its tendency to boil over into full-blown existential paranoia. Plato’s famous allegory of the cave ought not to have been taken literally, but it has been faithfully upgraded over the centuries – whether in the shape of Descartes’ Deus deceptor or the computational simulation of David Chalmers’s Reality+: Virtual Worlds and the Problems of Philosophy from 2022.

Perhaps a greater understanding of what simulations are and do would make future philosophers less likely to assume they are imposed from outside. In Beyond Good and Evil (1886), Nietzsche dismissed the theory that an operating system runs beneath our capacity to detect it. “It is, in fact,” he wrote, “the worst proved supposition in the world.” Of the simulation hypothesis specifically, Nobel Prize-winning physicist Frank Wilczek soberly asks: “If this is true, what is the thing doing the simulating made from?” Simulations are a tool – a constructive apparatus that is most fruitful where knowledge otherwise breaks down. As such they play a pivotal role in both diversion and discovery.

Court Ladies Adorning Their Hair with Flowers, Zhou Fang (late 8th-early 9th century A.D.)

1.2 What are simulations?

A simulation is a dynamic reconstruction of a system or process – a representation of complex reality. Simulations are neither weak instruments nor flawed copies of a “more true” reality; they rank among our most effective strategies for producing knowledge about the world.

Human beings create meaning through tools of imitation, things like words, numbers, or symbols. We layer semiotic information on the reservoir of incomprehensible chaos that surrounds us – signals we must transform into knowledge. We have no access to reality in any pure, unmediated sense. Whether through nervous systems, sentences, or a 3D model of an object, we construct channels for making contact with the world, and those channels become our primary means of apprehending it.

The terms “model” and “simulation” are often used interchangeably. They are closely linked but are not quite the same. Model, from the Latin modulus, meaning “measure,” refers to an “informative representation of an object, person, or system.” Models are “interpreted structures” whose purpose can be folded in with their ontology.⁠2 It is a common mistake to assume that all models aspire towards verisimilitude, or realistic representation. But by their very nature they must target salient features selected according to the model maker’s purposes.

Simulations are best described as the execution of a model, or of multiple overlapping models, over time. You build a model; you run a simulation. Both are kinds of measuring, but where models target the conceptualization or functional abstraction of real-world systems, simulations are focused on their implementation. What’s more, a simulation can be perturbed, idealized or incrementally transformed in order to record the shifting correspondence between it and the system it aims to represent. Noise and randomness can be introduced to create stochastic simulations whose outcomes cannot be known in advance.
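The distinction can be made concrete in a few lines of Python. In this sketch (the logistic growth rule and all parameter values are illustrative choices, not drawn from the text), the function is the model; calling it repeatedly over time is the simulation; and a nonzero noise term turns a deterministic run into a stochastic one.

```python
import random

def logistic_step(x, r=0.05, k=1.0):
    """One deterministic update of a simple logistic growth model."""
    return x + r * x * (1 - x / k)

def simulate(x0, steps, noise=0.0, seed=0):
    """Run the model over time; a nonzero `noise` makes the run stochastic."""
    rng = random.Random(seed)
    x = x0
    trajectory = [x]
    for _ in range(steps):
        x = logistic_step(x) + rng.gauss(0, noise)
        trajectory.append(x)
    return trajectory

deterministic = simulate(0.1, 200)            # identical on every run
stochastic = simulate(0.1, 200, noise=0.01)   # a perturbed, stochastic run
```

The deterministic run always settles toward the same equilibrium; the stochastic run wanders around it, and only repeated runs reveal the distribution of outcomes.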

Simulations are a crucial interpretative tool for capturing reality in a shared context. In this way they resemble other cognitive framing technologies – distinct categories we might be tempted to rank in terms of usefulness. Our instinctive bias as carbon-based intelligences might lead us to privilege in vitro experimentation over in virtuo computer simulations, seeing one as “material” and the other less obviously so. But they are equivalent, even if different, and often complementary.

In certain circumstances a computer simulation might be the only possible approach – when studying the origins of the universe, for example, or unpacking the formation of planetary systems from interstellar dust, neither of which can be recreated in the lab. Much like our own sensory apparatus, simulations cannot attend to all aspects of reality at once and are forced to be selective.

In Reconstructing Reality (2014), the philosopher Margaret Morrison argues that there is no justification for epistemically privileging the results of experiments over knowledge accessed through idealizations, abstractions and fictional models. One reason is that impossible mathematics can enable new discoveries: an infinite system is essential to explain phase transitions in statistical mechanics, for example, and assuming an infinite population has improved results in genetics even though no such population could exist.

A simpler way to understand the significance of simulations in the modern world would be to consider airline pilots. Each pilot is trained in a flight simulator containing software that is far more complex than the code onboard an actual airplane needs to be. Every nine months, pilots return to the airline’s training facilities to make sure they’re familiar with the latest procedures, emergency situations, and aircraft systems. These expensive devices are built from physical, mathematical and computational parts, the three categories into which most simulations can be placed.

John Easterby, engineering specialist, (left), and Colonel John A Graf, check readings on the Bay Model, June 12, 1957 (SF Chronicle)

1.3 Three types of model

John Reber was an amateur musical theater producer and schoolteacher in 1950s California who was anxious about the Bay Area’s fresh water supply. Despite his lack of expertise, he developed an ambitious plan to dam up the bay. Reber was no city planner, but he was influential: he knew how to generate funds and amass support. Fearing that the plan might actually be given the go-ahead, in 1957 the US Army Corps of Engineers built a hydraulic scale model to test it, complete with pumps that simulated fluid dynamics in the Bay, including tides, currents, and even the salinity barrier where fresh and salt water met.

The model was built at 1:1000 scale horizontally and 1:100 scale vertically, and replicated a full 24-hour tidal cycle just under once every fifteen minutes. Although Reber died before seeing the results, the model proved that damming the San Francisco Bay would cause flooding, structural damage and an enormous waste of resources. The Bay Model later became a museum: a remarkable example of the “concrete” or “physical” models that have aided scientific experimentation and shaped future projects for centuries.⁠3

The majority of scientific models fall into one of three categories: concrete, mathematical or computational. Beyond these there are edge cases like “model organisms” – widely available, easy-to-handle analogs such as yeast or E. coli (or mice, for mammals) whose biochemistry, gene regulation and behavior are regular across the class under study. Even so, most models conform to one of the three groups.⁠4

A classic mathematical model is the Lotka–Volterra model of predator–prey dynamics, developed to explain how fish stocks in the Adriatic responded to the decline in fishing activity during World War One. A well-known computational model is the economist Thomas Schelling’s “segregation model,” which used a 51×51 square grid and a random initial distribution of two populations to show how even a mild preference for in-group neighbors produces notable segregation without any economic or policy intervention.
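The Lotka–Volterra model is usually written as a pair of coupled differential equations, where $x$ is the prey population, $y$ the predator population, and the Greek letters are empirically fitted rate constants:

```latex
\begin{aligned}
\frac{dx}{dt} &= \alpha x - \beta x y \\[4pt]
\frac{dy}{dt} &= \delta x y - \gamma y
\end{aligned}
```

Prey grow at rate $\alpha$ and are eaten in proportion to encounters with predators ($\beta x y$); predators reproduce from those encounters ($\delta x y$) and die off at rate $\gamma$.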

Schelling's model of segregation
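A minimal Schelling-style sketch in Python gives a feel for how the model runs (the vacancy rate, the 0.3 tolerance threshold and the step count here are illustrative choices, not Schelling’s exact parameters): unhappy agents relocate to random empty cells until most are satisfied.

```python
import random

def neighbor_values(grid, r, c):
    """Values of the (up to eight) cells adjacent to (r, c)."""
    return [grid[(r + dr, c + dc)]
            for dr in (-1, 0, 1) for dc in (-1, 0, 1)
            if (dr, dc) != (0, 0) and (r + dr, c + dc) in grid]

def schelling(size=51, empty_frac=0.1, threshold=0.3, steps=20_000, seed=0):
    """Agents of two types are scattered on a grid; an agent with fewer than
    `threshold` same-type neighbors (as a fraction of its occupied neighbors)
    relocates to a randomly chosen empty cell."""
    rng = random.Random(seed)
    cells = ['A', 'B'] * int(size * size * (1 - empty_frac) / 2)
    cells += [None] * (size * size - len(cells))
    rng.shuffle(cells)
    grid = {(r, c): cells[r * size + c] for r in range(size) for c in range(size)}

    def unhappy(pos):
        occupied = [n for n in neighbor_values(grid, *pos) if n is not None]
        return bool(occupied) and (
            sum(n == grid[pos] for n in occupied) / len(occupied) < threshold)

    positions = list(grid)
    for _ in range(steps):
        pos = rng.choice(positions)
        if grid[pos] is not None and unhappy(pos):
            empty = rng.choice([p for p, v in grid.items() if v is None])
            grid[empty], grid[pos] = grid[pos], None
    return grid

def mean_similarity(grid):
    """Average fraction of same-type neighbors across occupied cells."""
    scores = []
    for pos, agent in grid.items():
        if agent is None:
            continue
        occupied = [n for n in neighbor_values(grid, *pos) if n is not None]
        if occupied:
            scores.append(sum(n == agent for n in occupied) / len(occupied))
    return sum(scores) / len(scores)
```

On a random initial grid the mean similarity hovers around 0.5; after the run it typically rises well above that, even though no individual agent ever demands a majority of in-group neighbors – the segregation emerges from the dynamics, not from any one preference.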

Simulations remain closely tied to the purpose for which they were designed. At the same time, certain features recur: a simulation may be stochastic or deterministic (depending on whether randomness and noise are intentionally introduced), idealized or detailed, continuous or discrete. It may also belong to one of the workhorse families of a particular field, such as Monte Carlo methods in finance, N-body simulations in astronomy, or reaction–diffusion models in biochemistry.
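As a toy member of the Monte Carlo family – the classic dart-throwing estimate of π rather than a finance application – the whole method fits in a few lines of Python:

```python
import random

def monte_carlo_pi(samples=100_000, seed=0):
    """Estimate pi by drawing random points in the unit square and
    counting the fraction that land inside the quarter circle."""
    rng = random.Random(seed)
    inside = sum(rng.random() ** 2 + rng.random() ** 2 <= 1.0
                 for _ in range(samples))
    return 4 * inside / samples
```

The estimate sharpens as the sample count grows, with error shrinking roughly as 1/√n – the signature trade-off of stochastic simulation: answers arrive quickly, certainty arrives slowly.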

1.4 Simulations and simulationism

There are many ways that simulations are deployed in search of ground truth and many that intentionally divert us from it. These other, parallel simulations may take place in virtual or augmented reality in the form of game worlds or social simulations. They may take advantage of spectacular 3D environments or the phenomenological superimposition of spatial computing, but there is no technological precondition for creating simulations capable of reshaping how society understands itself.

There are already thousands of cultural models which actively resist any incursion from fundamental or baseline reality. These closed worlds include everything from dictatorial or theocratic regimes to ineradicable nostalgias maintained in the present and projected on the past.⁠5 Any suite of aesthetic properties, value judgements and moral suppositions that allow groups to cohere around a shared purpose – be that a nation, worldview, personality or brand – runs the risk of fixity: a simulation immune to data signals that ought to update the model.

This is perhaps the paradox at the heart of simulations. For every cosmological model that directs thought towards a deeper understanding of the universe, there is a mirror somewhere in the world ready to distract us from it. Crucially, the two most likely run on the same computers, processed using algorithms whose shared ancestry can be traced without much difficulty. It is important to ask how parallel technologies can end up with wildly different relationships to truth – especially in a technological landscape where simulations themselves become sovereign actors: the basis for human, machine and planetary intelligence alike, and a core governance mechanism animating factories, designing cities and deploying geotechnologies.


Program Director: Benjamin Bratton
Studio Director: Nicolay Boyadjiev
Associate Director: Stephanie Sherman
Senior Program Manager: Emily Knapp
Network Operatives: Dasha Silkina, Andrey Karabanov
Thanks to The Berggruen Institute and One Project for their support for the inaugural year of Antikythera.