3.1 All but war is simulation

Conflict simulations have been in use since violence spilled out of tribes and metastasized into war. The games chess and Go, for example, were both inspired by military simulations. The first military victory widely credited to extensive simulation training is linked to the strategy board game Kriegsspiel – from the German for “war-game” – developed in 1811 for the new officer class of the Prussian Army.

The New War Game, "Polemos", as played at the Royal United Service Institution, Engraving by Johann Nepomuk Schonberg (1888)

Kriegsspiel does not exist to generate useful data but to train generals in a variety of hypothetical situations, making them better able to analyze, interpret and make decisions during battle. Although the game was shopped around in France and Russia, nobody outside Germany took much interest until the Prussian Army delivered a decisive defeat to France in the Franco-Prussian War (1870-1871) despite having no clear advantage in terms of munitions, soldiers or experience. Instead it was their cadre of officers that was praised: a generation of strategists raised with tabular calculations on a synthetic battlefield.⁠13

In Kriegsspiel, outcomes are decided using calculation tables. Various mathematical procedures have been devised to predict everything from the effect of morale on outcomes to supply chain management. Two of the best known are Lanchester’s linear and square laws, formulated by the British engineer Frederick Lanchester in 1914. The square law is often known as “the attrition model” and suggests that the power of a modern military force is proportional to the square of its number of units (the linear law, meanwhile, applied to pre-industrial combat using spears rather than long-range weapons). A later update was the Salvo Model, developed to represent naval combat as a series of salvos of missiles exchanged between opposing fleets.⁠14
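The square law can be made concrete with a small numerical sketch. The Python snippet below (the force sizes and effectiveness coefficients are illustrative assumptions, not figures from the text) integrates the paired attrition equations dA/dt = −βB, dB/dt = −αA and shows the “square” effect: with equal effectiveness, a force of 200 facing 100 loses only about 27 units.

```python
# Minimal sketch of Lanchester's square law ("attrition model").
# All numbers here are illustrative assumptions.

def lanchester_square(a, b, alpha=0.01, beta=0.01, dt=0.1):
    """Euler integration of dA/dt = -beta*B, dB/dt = -alpha*A
    until one side is annihilated; returns the survivors (A, B)."""
    while a > 0 and b > 0:
        a, b = a - beta * b * dt, b - alpha * a * dt
    return max(a, 0.0), max(b, 0.0)

# With equal effectiveness, the square law predicts the larger side
# keeps sqrt(A0^2 - B0^2) units: sqrt(200^2 - 100^2) ~ 173.
a_left, b_left = lanchester_square(200.0, 100.0)
print(round(a_left), round(b_left))  # → 173 0
```

Doubling a force thus quadruples its fighting power under this model, which is why concentration of force became a central principle of industrial-era tactics.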

Throughout the 20th century a greater number of variables – political, environmental, energetic – were incorporated into military simulations. It was in the United States that simulation practice and technology was elevated to the level of policy, evolving in tandem with the development of a material substrate on which the necessary computations for predicting outcomes would be run: transistors and semiconductors.⁠15

A permanent wargaming facility was created at The Pentagon in 1954 under the motto “All but war is simulation,” attributed to the 17th-century samurai-philosopher Miyamoto Musashi, who emphasized mental preparation and visualization in advance of combat. The American general Douglas MacArthur added his own spin, writing that “in no other profession are the penalties for employing untrained personnel so appalling or so irrevocable as in the military.”

Man holding up a magnifying glass for a better look at the samurai swordsman Miyamoto Musashi, woodblock print by Ichiyusai Kuniyoshi.

The longest-standing pact between services of the US military is a 1950 agreement between the Army and Navy about sharing training devices. In 1966, the agencies responsible for training soldiers moved to Orlando, where they introduced a synthetic flight training system for the Bell UH-1 “Huey” helicopter – the “workhorse” of the air cavalry during the Vietnam War.⁠16 Around the same time the Naval Training Device Center introduced the Multiple Integrated Laser Engagement System, or MILES, which used lasers and blank cartridges to enhance the realism of training and remained in constant use until the 2020s. The technology was not limited to military use, however. A similar system was released as a set of phaser guns by games manufacturer Milton Bradley to coincide with the release of Star Trek: The Motion Picture (1979), creating the market for recreational laser tag.

Airman First Class Jodey Powell of the 94th Security Forces Squadron, Dobbins Air Force Base, Georgia, wearing Multiple Integrated Laser Engagement System (MILES) gear and armed with an M16A2 rifle, repels attacking opposition forces at a mock village on Fort Dix, New Jersey.

3.2 America’s Army and Unreal Engine

Over time, the production of military simulations gradually opened up to civilian researchers and private institutions. RAND Corporation, Harvard, the National Defense University and MIT each ran simulations for the Pentagon, which included modeling the Vietnam War, the fall of the Shah of Iran, and tensions between India, Pakistan and China. In 1998, the Army established a contract with the University of Southern California’s Institute for Creative Technologies in Los Angeles. Although the history of RAND is inflected with Hollywood thanks to Herman Kahn’s nuclear “screenplay-scenarios,” this new contract represented a technology overlap between entertainment and the military that is rooted more in video games than cinema.

Gameplay from America's Army (2002)

The US Army and software company Epic Games worked together for eight years to produce the second generation of Epic Games’ graphics engine Unreal. The engine was launched along with the game it was designed to build: America’s Army (2002), a first-person tactical shooter that provided a virtual experience of soldier training and deployment, and was designed at the Modeling, Virtual Environments and Simulation Institute at the Naval Postgraduate School in Monterey, California.

The game was commissioned by Casey Wardynski, director of the Office of Economic and Manpower Analysis at West Point, who told the New York Times its original aim was to cut one of the army’s biggest expenses – recruitment. “If the game draws 300 to 400 recruits,” he said, “it will have been worth the cost.”

A 2008 study from the Massachusetts Institute of Technology found that “30 percent of all Americans ages 16 to 24 had a more positive impression of the Army because of the game” and, even more surprisingly, “the game had more impact on recruits than all other forms of Army advertising combined.” Counter to expectations, the game was a success with civilians, too, described as the “biggest surprise of the year” by IGN.⁠17

Interservice/Industry Training, Simulation, & Education Conference (I/ITSEC), Orlando, FL

After the invasions of Iraq and Afghanistan training became more field-based. The motto became “train as you fight,” emphasizing the need to model complex urban and rural terrain and non-conventional armies. In 2007 the various commercial, military and academic partners in Florida formally integrated as TEAM ORLANDO, presiding over an industry then worth $4.8 billion. Orlando became home to the National Center for Simulation and to I/ITSEC (the Interservice/Industry Training, Simulation and Education Conference), probably the largest simulation convention in the world: essentially where America advertises its latest simulation technology, with other NATO members as its primary customers.

In 2010 the US Air Force Research Laboratory built what was then the 33rd-largest supercomputer in the world by combining 1,760 Sony PlayStation 3s. The “Condor Cluster” was used for radar enhancement, pattern recognition, satellite imagery processing, and artificial intelligence research, and was the fastest interactive computer in the entire US Defense Department for a time.

No combination of later consoles was able to rival the world’s fastest supercomputers, yet each successive generation of video game consoles forced innovation in the field. Console producers could offer huge economies of scale and manufacturing subsidies, assuming that software purchases and percentages of in-game economies would recoup early losses. The resulting arms race between console makers and chip designers, fuelled by the public’s willingness to pay for improved graphics, produced the technical innovation needed to overcome countless bottlenecks and dead ends across the military, science and industry. This was the era of the GPU: the computational architecture that began a steep upward curve for simulation in all areas of life.

3.3 The origins of the GPU

One of the first personal computers to escape the laboratory into mass production was the Datapoint 2200, whose instruction set Intel implemented in its 8008 CPU (the machine as shipped used discrete logic instead). Intel’s previous logic chip, the 4004 central processing unit, was the world’s first commercially produced microprocessor, commissioned by the Japanese office devices firm Busicom for use in electronic calculators.⁠18 Although the idea of a chip explicitly dedicated to computer graphics was popularized by the launch of Nvidia’s GeForce 256 in 1999 – with the term itself reaching mainstream consciousness thanks to its use in the original Playstation five years before that – Nvidia’s frequent claim that theirs was the first specialized graphics accelerator is inaccurate.

The first playable computer games, Bertie the Brain (1950), Tennis for Two (1958) and Spacewar! (1962), ran on systems that were custom-built. In the same way, arcade machines such as Computer Space (1971), Pong (1972) and later Space Invaders (1978) housed their hardware in tall fiberglass cabinets and used “video controllers” hard-coded to output visuals for each specific game.

Like a gallery in any arcade, this was a demonstration of what was and could be possible, even if private ownership of such devices was beyond the reach of most households. The first home games consoles – the Magnavox Odyssey (1972) and Atari 2600 (1977) – appeared at a time when computers of all stripes were still extremely expensive. Although the history gets a little difficult to parse, no home console seems to have had a purpose-built programmable graphics card until Nintendo’s Famicom in 1983.⁠19

A GPU is a computer chip with a variety of hardware-accelerated 3D rendering features, but in the early years the label was more a marketing term than a signifier of any real difference from existing CPUs. The Commodore Amiga, released in 1985, offloaded image processing to a custom chipset, and Texas Instruments’ TMS34010 processor (1986) was the first fully programmable graphics chip, but it wasn’t until intuitive graphical user interfaces such as Windows became popular that what we think of as GPUs today began to speciate onto a separate path.

That same year the Canadian company ATI launched its Wonder series of video card add-ons for IBM personal computers, which supported multiple displays and enabled users to switch resolutions, yet still relied on the system’s CPU for many core processing tasks. In the early 90s APIs like OpenGL (1992) and DirectX (1995) enabled programmers to write code for different graphics adapters, creating a standard software platform that games studios could build on. CD-ROMs enabled greater data storage, and two titles, Virtua Racing (1992) and Doom (1993), were among the standouts that pioneered 3D (or at least 2.5D) graphics for PC gamers, much as Sonic the Hedgehog (1991) and Street Fighter 2 (1991) had done on the Sega Genesis and Super Nintendo with parallax scrolling, a much richer color palette and light-related techniques such as shadows.

Japanese ad for Virtua Racing, 1992

By the time the original Playstation was released in 1995, consumer PC gaming was going through a slump, with graphics cards such as the S3 ViRGE jokingly referred to by gamers as a “graphics decelerator.” The runaway success of Sony’s console lit a fire that spread through the hardware market. The Voodoo Graphics card from San Jose’s 3dfx Interactive, Inc., released in 1996, and its successor the Voodoo2 (1998), the first consumer card to support a multi-GPU (SLI) setup, signaled that maximizing resolution and frame rates by any means necessary would be the industry’s holy grail.

3.4 GPUs in Ascendance

In 1999 Nvidia released the first GPU in its Geforce series for domestic PC gaming. The Geforce 256 introduced hardware transform and lighting, taking over the geometry and lighting calculations behind complex scenes, though it was only with the release of the Geforce 2 (NV15) in 2000 that game developers were able to catch up and make use of what Nvidia founder Jensen Huang had hailed as “the single largest chip ever built.”

In 2002, 3dfx went bankrupt after attempting the notoriously difficult move into manufacturing its own hardware and was acquired by the “fabless” Nvidia, leaving Canada’s ATI and Nvidia as the two most prominent companies in the field. To differentiate its product from that of its southern rival, Toronto-based ATI used the term VPU – or “visual processing unit” – and launched its rival to the Geforce, the Radeon series, in 2000.

The launch of the Playstation 2 in 2000 and the Xbox in 2001 introduced standard definition, a resolution of 480p, to home consoles, but the febrile competition between ATI (who were acquired by AMD, the creators of Athlon and Opteron processors, for around $5.4 billion) and Nvidia meant that by the end of the 2000s PC gaming had comfortably dwarfed what the next generation of consoles could offer – sustaining 1080p, high-definition resolution at frame rates above 60fps. In 2006 Nvidia released the Geforce 8800GTX – an immense, energy-hungry card with 681 million transistors, a 3-way linkable interface for the most ambitious gamers and a core clock rate of 575 MHz, delivering more raw arithmetic throughput than the CPUs it was commonly paired with.⁠20 A crossover moment had arrived.

With the uncanny growth of resolution and frame rates in the 2010s, GPU technology finally foundered before the exponential curve ahead of it. The jump from 1080p to 4K, for instance, quadruples the number of pixels per frame, a leap whose computational and energetic demands were at that point unjustifiable. Instead, in the 2010s the same companies worked on software enhancements to the way lighting dynamics are calculated. They used deep learning to upscale resolution, improving visual output without requiring additional power or a quicker frame rate, pushing chip architecture to diversify once again.⁠21
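The scale of that jump is easy to verify with back-of-the-envelope arithmetic; the 60fps frame rate below is an illustrative assumption:

```python
# Pixels per frame at each resolution.
res_1080p = 1920 * 1080   # 2,073,600 pixels
res_4k = 3840 * 2160      # 8,294,400 pixels

# 4K draws exactly four times the pixels of 1080p per frame.
ratio = res_4k / res_1080p
print(ratio)  # → 4.0

# At an illustrative 60 frames per second, the raw pixel throughput:
print(res_4k * 60)  # → 497664000 pixels per second
```

Nearly half a billion pixels per second, each requiring shading and lighting calculations, is the workload that made brute-force rendering uneconomical and learned upscaling attractive.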

The most popular Nvidia cards of the early 2020s were 2020’s RTX 3090, marketed as the world’s first 8K GPU, and 2022’s RTX 4090. Each contains distinct cores dedicated to ray tracing and others for deep learning super sampling (DLSS), which uses AI to upscale resolution. It’s somewhat ironic that the advances in machine learning throughout the 2010s that enabled chips to predict what we want to see were themselves made possible by targeted AI research using GPUs originally designed for gaming.

The power of generative AI and new hardware capable of running increasingly large, complex models spread everywhere from aircraft design to modeling fusion reactions and cosmological models capable of demonstrating how galaxies “breathe in” gases to create stars and let them fall away after those stars explode or die. The ability to model highly complex, dynamic environments opened new possibilities for scientific discovery with simulation integrated in the instruments and the results, but it also set the bar for training soldiers in the experience of war to guarantee the upper hand when they needed it most.⁠22

3.5 Virtual Battle Station

The Czech games studio Bohemia Interactive was founded by brothers Marek and Ondřej Španěl under its original name, Suma, in 1985. The duo had developed an obsession with computers after Marek bought a Texas Instruments TI-99/4A in the early 1980s; in 1995 the studio released a hovercraft-based military simulator called Gravon: Real Virtuality.

After various dead ends, the studio found success with Operation Flashpoint: Cold War Crisis in 2001, a tactical shooter set on a fictional archipelago whose islands were divided between the Soviet Union and the United States. The game sought to outdo competitors in terms of realism by 3D-scanning guns and even modeling individual eyeballs. It caught the attention of the US Army, which began using a modded version of the game called DARWARS Ambush! to train soldiers, supported by DARPA funding.

Bohemia would later set up a special division called Bohemia Interactive Simulations (BISim) exclusively to cater to military training and simulation. In 2001 they signed an exclusive contract with the US Marines to develop the first edition of their Virtual Battlespace (VBS) series. Where a game like Call of Duty or Counter-Strike might hone a player’s motor skills and reaction times, VBS incorporated the latest ordnance, bringing game controls into alignment with existing weaponry. Deployed in enormous Battle Simulation Centers in the US and overseas, soldiers sat side-by-side in monitor-filled rooms to practice the techniques, procedures, and outcomes they might encounter in the field. Much like Kriegsspiel it was an exercise in collective cognition enabled by simulation – establishing useful reference frames and teaching groups of humans how to think.

BISim soldiers posing for a photo

VBS3 became the entire army’s “game for training” when it was released in 2014. Meanwhile Bohemia Interactive continued to create games for civilian use. In 2012, the company fell into financial difficulties after two of the studio’s staff, Martin Pezlar and Ivan Buchta, were arrested on the Greek island of Lemnos on suspicion of espionage after being caught photographing military installations, and were held for 129 days. If convicted, Pezlar and Buchta faced 20-year sentences; the affair ruined morale at Bohemia and overshadowed the launch of Carrier Command: Gaea Mission, the studio’s latest game. In the end, Czech president Václav Klaus stepped in and Bohemia encouraged gamers to petition the Greek government, both of which contributed to the pair’s eventual release.

3.6 One World Terrain

After the occupations of Iraq and Afghanistan the US Army, whose capital injections had rescued Bohemia Interactive more than once, stressed the need to improve close combat training on the front line – which accounted for just four percent of all personnel but where 90 percent of total casualties occurred. The US’s 2018 National Defense Strategy included a commitment “to provide a cognitive, collective, multi-echelon training and mission rehearsal capability” and combine “virtual, constructive and gaming training environments into a single Synthetic Training Environment (STE).” The commitment was both inspired by and an accelerator of a game studio culture increasingly devoted to generating large open worlds that more fully embraced the firepower provided by banks of next-generation GPUs.

Bohemia set to work on a specific aspect of the mission known as One World Terrain, which aspired to create “a realistic, common, accessible and automated 3D terrain data set for simulation and, potentially, mission command and intelligence systems, to conduct collective training, mission rehearsal and execution” that could leverage “game-streaming technologies enabled by new advances in fiber optic and 5G wireless networks.” The result was VBS4, a step change in the series described by the company as “a whole-earth virtual desktop trainer and simulation host that allows you to create and run any imaginable military training scenario,” released to US and allied military customers at the start of 2020.

VBS4 Mixed reality tools. "Soldiers collaboratively mark adversary troop locations on a projected digital twin map."

VBS4 is more of a game engine than a game. It was specifically designed for military simulation and training. It simulates the whole planet, opening with a giant blue-green orb that can be manipulated from space and altered down to individual patches of grass. It provides users a relatively straightforward interface with which to build battle spaces and to integrate terrain data “from any conceivable source.” In a 2019 interview with PCMag, Australian Army veteran and Chief Commercial Officer of Bohemia Interactive Simulations, Peter Morrison, described how the system incorporates drone-collected 3D data. “The military can fly a drone at a low level, and using photogrammetry, a highly realistic 3D model of a huge area can be constructed,” Morrison said. “Our users can quickly pull that data into the game and use it as the basis of training.”⁠23

Trainees inhabit avatars within the simulation that they access either at their desktop or using devices such as headsets, vehicle cockpit simulators, advanced weapons simulators and parachute harnesses. Senior military figures use mission planning tools and deploy enemy soldiers to test and assess tactics – much as the Prussian Army had done while playing Kriegsspiel two hundred years before. What was so obviously different was the scale and resolution of these operations: from a handful of calculations to a simulation that aimed to stand in for the globe and to grant users video game-style controls over it.

In 2022 Bohemia Interactive Simulations was acquired by British arms and aerospace manufacturer BAE Systems for $200 million. In the press release accompanying the acquisition, the company noted that global spending on military training simulation environments and related services was expected to surpass $11 billion a year. VBS4 is accessible through the cloud, enabling troops on deployment to simulate conditions in the next valley over by inputting tomorrow’s weather.

UNREAL GT extension for Unreal Engine to produce synthetic training data for self-driving cars.

As with previous innovations, like the MILES laser system, no sooner is a technology unveiled than it diffuses throughout the economy. The overlap between physics engines, collision detection, scene graphing and network structure developed in tandem with (and cannot be separated from) revelations from the field of artificial intelligence. Game engines built to engineer open worlds and graphics accelerators refined by fierce industrial competition led to significant breakthroughs that had no obvious connection to defense – in manufacturing, biotech and training AIs. By the start of the 2020s, the very same software platforms were being used to train autonomous vehicles, to plan cities and to update climate prediction methodologies that hadn’t substantially changed since the 1970s.


Program Director: Benjamin Bratton. Studio Director: Nicolay Boyadjiev. Associate Director: Stephanie Sherman. Senior Program Manager: Emily Knapp. Network Operatives: Dasha Silkina, Andrey Karabanov.
Thanks to The Berggruen Institute and One Project for their support for the inaugural year of Antikythera.