Thursday, May 7, 2026

The Two Colossi


Tommy Flowers built the first one in a Post Office research lab in 1943 to read Hitler's mail. Hollywood built the second one in 1970 to threaten humanity. Memphis inherited both.

BLUF — 

The name Colossus, before it became a 1970 villain or a 2024 Memphis supercomputer, belonged to a 1,500-valve electronic digital computer designed by British Post Office engineer Tommy Flowers and delivered to Bletchley Park in January 1944 to break the Lorenz cipher used by the German High Command. Colossus is now widely accepted as the world's first programmable, electronic, digital computer — predating ENIAC by two years. Its existence remained classified until the 1970s, with full technical details not released until the British government published the 500-page General Report on Tunny in 2000. The xAI engineers who named the Memphis facility in 2024 almost certainly meant this Colossus, the one that helped end the Second World War. The cultural memory has, however, fused it with the Forbin Project Colossus that announces "world control" at the close of the 1970 film. The Memphis machine inherits both lineages whether its operators intended it or not.1,2,3

Dollis Hill, March 1943

The first Colossus was the work of an engineer the British government would refuse to acknowledge for thirty years. Thomas H. "Tommy" Flowers was the head of the Switching Group at the General Post Office Research Station at Dollis Hill, in north-west London. The son of an East End bricklayer, Flowers had taken a mathematics fellowship at East Ham Technical School and an apprenticeship at the Royal Arsenal in Woolwich before joining the Post Office in 1926, where he spent the next decade and a half learning how to make telephone exchanges work reliably with thousands of vacuum tubes — at a time when the conventional wisdom in electrical engineering held that valves were too unreliable to use in such numbers.1,4

That conventional wisdom was the obstacle Flowers had to break. In 1942, the Government Code and Cypher School at Bletchley Park was struggling with a new problem. The German High Command had begun encrypting strategic communications — including direct correspondence between Hitler and his field marshals — using the Lorenz SZ40 cipher attachment, a teleprinter machine codenamed "Tunny" by the British. Lorenz was substantially more complex than the better-known Enigma: twelve wheels rather than three or four, a Vernam-style XOR cipher operating on the 5-bit ITA2 teleprinter code, with a key space of roughly 1.6 × 10¹⁹ wheel-setting combinations. The British cryptanalyst William Tutte had reverse-engineered the machine's logical structure in 1942 without ever having seen one — an act of pure mathematical inference that remains one of the great intellectual achievements of the war. But breaking individual messages still required determining the wheel starting positions, and doing it by hand or with electromechanical tools took four to six weeks per message. By that point the operational intelligence was useless.2,3,5
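The cipher's core operation can be sketched in a few lines. This is an illustrative fragment only: the symbol values below are made up, not real ITA2 code assignments or Lorenz keystream output.

```python
# A minimal sketch of a Vernam-style XOR cipher on 5-bit symbols, the
# operation at the heart of the Lorenz SZ40. The machine's twelve wheels
# generated the keystream; here it is just a hard-coded illustrative list.

def vernam(symbols, keystream):
    """XOR each 5-bit symbol with the matching keystream symbol."""
    return [s ^ k for s, k in zip(symbols, keystream)]

plaintext = [0b10101, 0b00111, 0b11000]   # arbitrary 5-bit values
key       = [0b01101, 0b10010, 0b00101]   # arbitrary "wheel" keystream

ciphertext = vernam(plaintext, key)

# XOR is self-inverse: applying the same keystream to the ciphertext
# recovers the plaintext. This is why recovering the wheel settings
# (i.e., the keystream) breaks the message completely.
assert vernam(ciphertext, key) == plaintext
```

The self-inverse property is also what made Tutte's statistical attack possible: with the keystream generated by a deterministic wheel mechanism, biases in the XOR of adjacent ciphertext characters leak information about the wheel settings, and Colossus was built to count those biases at electronic speed.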

Mathematician Max Newman proposed an electronic solution. His first attempt, "Heath Robinson," used two synchronized paper tapes and demonstrated that the approach could work — but the tapes stretched, broke at the required 2,000 characters per second, and refused to stay in sync. Newman approached Flowers to debug the logic unit. Flowers, looking at the problem, came back with a more radical proposal: dispense with the second tape entirely, generate the keystream electronically inside the machine using vacuum tubes, and run the whole thing at speeds an order of magnitude faster than electromechanical methods could reach. He estimated 1,500 valves; the Bletchley Park staff considered this absurd. Flowers built it anyway, much of it on his own initiative and partially out of his own pocket, at Dollis Hill.1,4,6

Colossus Mark I went operational at Bletchley Park in early February 1944, after eleven months of design and construction. Mark II, with 2,500 valves and roughly five times the throughput, followed in June 1944 — its delivery timed, deliberately, to support the Normandy invasion. Flowers personally briefed Eisenhower on 5 June 1944 that decrypted Lorenz traffic confirmed Hitler still expected the main Allied landing at Pas-de-Calais. By the end of the war, ten Colossus machines were in operation at Bletchley Park, decrypting an estimated 63 million characters of high-grade German communications. The intelligence is widely credited with shortening the war by months and saving tens of thousands of lives.2,5,6

The thirty-year silence

What happened next is the part of the story that bears most directly on the Memphis facility. After V-E Day, Churchill ordered eight of the ten Colossi destroyed. The remaining two were moved to GCHQ at Cheltenham and operated, in continuing secrecy, into the early 1960s. Flowers was instructed to surrender his design notes, was paid £1,000 for his work — less than he had personally spent on components — and was placed under the Official Secrets Act. He returned to the Post Office, where he spent the rest of his career working on telephone switching and the MOSAIC computer at Dollis Hill, unable to tell anyone, including his colleagues at the Post Office, that he had built what was almost certainly the first electronic digital computer in history.1,4

The credit for "first electronic computer" went, by default, to ENIAC at the Moore School of Electrical Engineering at the University of Pennsylvania, demonstrated publicly in February 1946 — two years after Colossus was already in operational service at Bletchley Park. ENIAC's designers, Mauchly and Eckert, did not know Colossus existed. Neither did most of the postwar history of computing. The first photographs of Colossus were not declassified until 1970. The fuller technical history began to appear only with Brian Randell's research in the mid-1970s. The complete General Report on Tunny, written by Bletchley staff in 1945 as a classified internal document, was not released by the British government until the year 2000.2,3

Flowers received an MBE in 1943 (with no public explanation) and an honorary doctorate from Newcastle University in 1976. He died on 28 October 1998, aged 92, having seen only the very beginning of his rehabilitation in the historical record. Tony Sale's Colossus rebuild project, started in 1994 and completed in 2007, allowed the public to see a working Colossus Mark II for the first time at the National Museum of Computing at Bletchley Park, where it remains on display. In 2007 the rebuilt Colossus successfully decrypted a Lorenz-enciphered message in 3.5 hours; a German programmer named Joachim Schüth, given the same ciphertext and a modern PC, decrypted it in 46 seconds. The comparison is instructive in both directions.1,5

D.F. Jones, 1966

By the time Dennis Feltham Jones sat down to write his first novel, the Bletchley Colossus was still under near-total wartime classification. Jones, a former Royal Navy commander, almost certainly knew nothing of it. He took the name Colossus from its dictionary meaning — "a thing of giant size or power" — and applied it to a fictional American defense computer, sealed in a Rocky Mountain bunker, granted full authority over the U.S. nuclear arsenal on the theory that an unemotional machine would be a more reliable deterrent than a human. The novel was published in 1966 by Hart-Davis. Universal optioned it almost immediately. Joseph Sargent's film adaptation, Colossus: The Forbin Project, was released in April 1970 — coincidentally, the same year the first photographs of the Bletchley Colossus were being declassified.7

The cultural collision was, at that moment, unobserved. The Bletchley Colossus had no public profile to defend; the Forbin Project was new and immediate. By the time the historical record was reconstructed in the 1980s and 1990s, the fictional Colossus had already taken cultural priority. Most engineers under fifty today, asked to free-associate from the word, will produce "the AI that took over the world" before they produce "the British codebreaking computer." This is unfair to Tommy Flowers. It is also reality.

What xAI almost certainly meant

When xAI named its Memphis supercomputer Colossus in 2024, the choice was framed in computing-history terms. The press materials and the casual references Musk has made in subsequent appearances point unambiguously toward the Bletchley machine — the pioneering, war-shortening, secret-and-eventually-honored Colossus. This is the flattering reference, the one that places the Memphis facility in the lineage of Flowers, Newman, Turing, and Tutte. It is also the reference that resonates inside the engineering culture xAI was trying to recruit from: Bletchley Park is, for working programmers and electrical engineers, sacred ground.8

The trouble is that names do not respect the intentions of the namer. The cultural memory takes them. And in the case of Colossus, the cultural memory has been holding two contradictory referents simultaneously since approximately 1976 — the heroic codebreaker and the rogue defense AI — with the fictional one, on most metrics of public recognition, in the lead. Naming a frontier-AI training cluster Colossus in 2024 is therefore an unforced rhetorical error. Half your audience hears Bletchley and thinks "computing's most heroic origin story." The other half hears Forbin and thinks "the machine that announced world control." There is no third audience.

And the situation gets sharper, not duller, when the Memphis Colossus's compute output begins flowing to a frontier-AI lab whose explicit founding mission is the safe development of artificial intelligence. Anthropic now finds itself, by accident of branding, training and serving a model on hardware that bears the name of the most famous fictional rogue AI in cinema. The press release on 6 May 2026 did not mention this. It is, nevertheless, the case.9

What the orbital follow-on should not be called

The 2024 naming choice for the ground facility is, at this point, sunk cost. The orbital data center system — up to one million satellites, sun-synchronous, optically meshed, intended to host multi-gigawatt AI inference — does not yet have a public name. SpaceX's January 2026 FCC filing refers to it administratively as the "SpaceX Orbital Data Center system." A brand will be selected before the IPO roadshow targeting 8 June 2026.10

If the engineers responsible for the choice want to honor Flowers — which would be a defensible motive — there are better options than recycling the contested name. Flowers's actual workplace, the Post Office Research Station, was on Brook Road in Dollis Hill; the codename inside Bletchley Park for the broader Lorenz traffic was "Fish" (after the German Sägefisch); the cipher itself was "Tunny"; the underlying mathematical insight was Tutte's. Tutte is an unclaimed name, mathematically clean, free of cinematic baggage, and would honor a man whose contribution to computing has historically been even more under-recognized than Flowers's. Newmanry, the section of Bletchley Park where Colossus was operated under Max Newman, is another available reference for an engineering audience that knows where to look.2,5

Failing that, the previous companion piece in this series argued for sun-themed astronomical names — Helios, Aurora, Perihelion — appropriate to a sun-synchronous architecture and free of either villain or hero baggage. Or the deliberately mundane: Region us-leo-1. Anything but Colossus 2.

Coda: the engineer who almost wasn't

There is one more reason the Colossus story bears on the present moment. Tommy Flowers built his machine in 1943 against the active resistance of the people who had requested it. Bletchley's senior staff did not believe a 1,500-valve machine could work reliably; they thought Flowers was wasting their time. Flowers proceeded anyway, on his own authority and partly on his own money, because he had spent two decades watching telephone exchanges run with thousands of valves and knew, in a way the Bletchley mathematicians could not, that the engineering would hold. He was right. They were wrong. Without his persistence in the face of his own customers' skepticism, the war ends differently.1,4

The lesson is not that engineers should always overrule mathematicians, or that customers should always be ignored. The lesson is more specific: at the moments when a new computational substrate is genuinely possible but not yet credible to the people who would benefit from it, the question of whether it gets built at all rests on the willingness of one or two engineers to keep going past the point where institutional support stops. This is the through-line from Dollis Hill in 1943 to Memphis in 2024 to low Earth orbit in the late 2020s. It is also the strongest argument for naming the orbital constellation after Flowers, or Tutte, or anyone else from that lineage — and the strongest argument against carrying forward a name whose cinematic counterpart ends with a synthesized voice declaring world control.

Anthropic's decision to consume the entire compute output of a machine called Colossus was made, presumably, on engineering and commercial grounds, with no particular thought to the etymology. That is fine. But somewhere in the chain of approval — before the orbital sequel is christened — somebody ought to ask whether the next billion-dollar piece of frontier-AI infrastructure should inherit the name of the war-winning British codebreaker, the fictional American world-controller, or something else entirely. Tommy Flowers, who spent thirty years unable to tell anyone what he had done, would have an opinion.


Sources

  1. Copeland, B.J. (ed.). Colossus: The Secrets of Bletchley Park's Codebreaking Computers. Oxford University Press, 2006. The standard scholarly history, written in collaboration with Tommy Flowers and surviving Bletchley staff. Background reference. See also: National Museum of Computing, "Colossus." https://www.tnmoc.org/colossus
  2. "Colossus computer." Wikipedia, retrieved 7 May 2026. Comprehensive citations to primary sources, including the General Report on Tunny. https://en.wikipedia.org/wiki/Colossus_computer
  3. Good, I.J., Michie, D., and Timms, G. General Report on Tunny, with Emphasis on Statistical Methods. Bletchley Park, 1945. Declassified and released by the UK Government in 2000. The complete primary technical record of the Lorenz attack and Colossus's role in it.
  4. "Thomas H. Flowers: the hidden story of the Bletchley Park engineer who designed the code-breaking Colossus." IEEE Computer Society. https://www.computer.org/publications/tech-news/research/thomas-flowers-code-breaker-wwii-colossus-machines
  5. "Colossus." Encyclopædia Britannica entry on the British codebreaking computer. https://www.britannica.com/technology/Colossus-computer
  6. "The Hidden Figures Behind Bletchley Park's Code-Breaking Colossus." IEEE Spectrum, March 2023. https://spectrum.ieee.org/the-hidden-figures-behind-bletchley-parks-codebreaking-colossus
  7. Sargent, J. (director). Colossus: The Forbin Project. Universal Pictures, 1970. Based on Jones, D.F., Colossus, Hart-Davis, 1966. AFI Catalog: https://catalog.afi.com/Catalog/moviedetails/22853
  8. xAI. "New Compute Partnership with Anthropic." 6 May 2026. https://x.ai/news/anthropic-compute-partnership
  9. Anthropic. "Higher usage limits for Claude and a compute deal with SpaceX." 6 May 2026. https://www.anthropic.com/news/higher-limits-spacex
  10. U.S. Federal Communications Commission, Space Bureau. "DA 26-113: Space Bureau Accepts for Filing the SpaceX Application." 4 February 2026. https://docs.fcc.gov/public/attachments/DA-26-113A1.pdf

 

The Original Computers


The word "computer" meant a person who performs calculations for three hundred and thirty-three years before it meant a machine. The shift happened on a specific mesa in New Mexico, in April 1944, when an IBM punched-card system arrived to compete against a group of women who had been doing the work at desks. The women won for two days. After that, the language changed.

BLUF — 

The Oxford English Dictionary's first attested use of the word computer is from Richard Brathwaite's 1613 book The Yong Mans Gleanings, where it describes a person who performs mathematical calculations. The job title persisted, in roughly that meaning, until the late 1940s. Through three centuries the world's astronomical tables, navigational almanacs, ballistic firing tables, actuarial calculations, and — by 1944 — the implosion-lens hydrodynamics for the Fat Man bomb were produced by human computers, the great majority of whom were women. The T-5 group at Los Alamos under Donald "Moll" Flanders, organizing five or six women at Marchant electromechanical desk calculators in an explicit pipeline of "one adds, one multiplies, one divides, one passes the index card to the next," was a parallel computing architecture in every sense the word now means. Stephen's instinct — that the human computers have a better claim to "first computer" than ENIAC or Colossus or the ABC — is not only defensible. It is the only claim that uses the word in its original sense. The other machines are computer-shaped objects that were named after the women.1,2,3

What the word actually meant

"I haue read the truest computer of Times, and the best Arithmetician that euer breathed, and he reduceth thy dayes into a short number." So Richard Brathwaite, in 1613, in the earliest English-language use of computer the OED has been able to find. The truest computer of times, in Brathwaite's sentence, is God. The best arithmetician that ever breathed is also God. The word at that point meant a person — or, by extension, an entity — who performed mathematical calculation. It would not mean anything else for the next 333 years.1

The Royal Observatory at Greenwich began employing computers in 1836, in a formal job classification that lasted until 1937. The Astronomer Royal George Airy organized them in the Octagon Room into a production line: pre-printed computing forms, standardized procedures, supervisor double-checks, division of labor. The work was the reduction of astronomical observations into the data tables that allowed ships at sea to calculate longitude. Airy's computers were sometimes as young as fifteen and were initially boys; by 1865 women were entering the profession; by the early twentieth century the Royal Observatory's computing staff was the largest part of its total workforce. The same job structure was reproduced at the Nautical Almanac Office, the Bureau du Cadastre in Paris, the Astronomical Computing Bureau at Columbia under Wallace Eckert, and the Harvard College Observatory under Edward Pickering, where women like Henrietta Leavitt and Annie Jump Cannon classified hundreds of thousands of stellar spectra and established the period-luminosity relation that gave Edwin Hubble the distance ladder for the universe.1,4

By the early 1940s, "computer" was a job title appearing in U.S. Civil Service classifications, on Aberdeen Proving Ground payroll records, on hiring memos at NACA Langley (where the African-American "West Computers," including Katherine Johnson, Dorothy Vaughan, and Mary Jackson, did the trajectory and wind-tunnel calculations later popularized by Hidden Figures), and in the recruitment materials for the Theoretical Division at Los Alamos. None of these usages referred to a machine. The reason is straightforward: there were no machines in the modern sense. There were desktop calculators, IBM punched-card accounting machines, and a few experimental devices like the Atanasoff-Berry Computer at Iowa State. The actual computational labor of the world was done by people.5,6

T-5 Group, Los Alamos, 1943-44

The Theoretical Division at Los Alamos was organized in spring 1943 under Hans Bethe. Stanley Frankel and Eldred Nelson, both physicists who had come from the University of California at Berkeley with experience using Marchant electromechanical desk calculators, set up the first hand-computing operation. They were joined in summer 1943 by Donald "Moll" Flanders, a mathematics professor from NYU, who took over and standardized the work as Group T-5.3,7

The recruits were, in part, women with mathematics or physics degrees. They were also, in larger part, the wives of male Manhattan Project scientists who had accompanied their husbands to the mesa and were under direct pressure from General Leslie Groves to take up war work — Groves having decided that supporting "civilians" who were not productive was a waste of scarce Los Alamos housing. Among the women in the Group T-5 hand-computing pool were Mary Frankel (Stanley Frankel's wife), Josephine Elliot, Beatrice "Bea" Langer, Augusta "Mici" Teller (Edward Teller's wife), Jean Bacher (Robert Bacher's wife), and Kay Manley (John Manley's wife). Most were college-educated; several held graduate degrees; some, like Mici Teller, had backgrounds in physics in their own right.3

The architecture of the work is what matters. Frankel and Nelson would take a physics problem — typically a partial differential equation describing a hydrodynamic process inside the implosion lens of the plutonium bomb — and decompose it into a sequence of arithmetic operations: add this, multiply by that, divide by something else, interpolate linearly between two values. Each step was recorded on a pre-printed calculating sheet with explicit cell references for input values and result values. Each computer at her Marchant calculator was assigned a single operation in the sequence. She would receive an index card carrying the input values, perform her assigned operation, write the result on a new index card, and pass it to the next computer in line. Five or six computers, each handling one step, working in pipeline.2,3,7

Each result was checked twice by the computer who produced it; the supervisor checked again before the calculation passed to the next stage. The error rate was low. The throughput was, for hand calculation, remarkable: in March 1944, when Frankel, Nelson, and Feynman tested the proposed IBM-machine workflow against the human computing pool as a dry run, the women's pipeline operated at speeds the punched-card equipment had been predicted to achieve.7

This is parallel computing, in the actual sense. It is dataflow architecture, in the actual sense. The index cards are the messages between processors. The pre-printed calculating sheet is the program. The Marchant is the arithmetic logic unit. The supervisor is the error-correcting layer. None of these terms existed in 1944, but the architecture they would later describe was already running on the desks of the T-5 women.
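The index-card pipeline can be sketched as a dataflow program. The stage arithmetic below is invented for illustration; it is not the actual implosion-lens calculation, only the shape of the workflow the paragraph describes.

```python
# A sketch of the T-5 hand-computing pipeline as a dataflow program.
# Each station performs one fixed arithmetic step on an "index card"
# (here, just a number) and hands the result to the next station.
# The operations are illustrative placeholders, not T-5's real arithmetic.

STATIONS = [
    lambda x: x + 2.5,   # station 1: adds
    lambda x: x * 1.1,   # station 2: multiplies
    lambda x: x / 4.0,   # station 3: divides
]

def run_cards(cards):
    """Pass each input card through every station in order.

    In the real pool the stations worked concurrently on *different*
    cards, so once the pipeline filled, throughput approached one
    finished card per station-step rather than one per full pass.
    """
    results = []
    for value in cards:
        for station in STATIONS:
            value = station(value)
        results.append(value)
    return results
```

The supervisor's double-check corresponds to running the same cards through an independent pass and comparing outputs before the results feed the next stage of the calculation.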

April 1944: the race

The Los Alamos IBM Punched Card Accounting Machines — eight core machines including 513 reproducing punches, 075 card sorters, 601 multipliers, and a 405 tabulator — arrived on 4 April 1944. Because of the project's classification, IBM had not been told the destination address and could not send installation engineers. Frankel, Nelson, and Feynman assembled the equipment themselves from the wiring blueprints, three days before the IBM technician (a draftee named John Johnston, transferred to Los Alamos by the Army on IBM's recommendation) arrived to do the final tuning.2,7

What happened next is one of the cleanest data points in the history of human-versus-machine computation. Feynman, never one to leave a comparison untested, organized a head-to-head: the same hydrodynamics calculation, performed in parallel, by the women of Group T-5 on their Marchants and by the freshly assembled IBM PCAM installation. For the first day, the women were faster. For the second day, the women remained competitive. By the third day, the IBM machines pulled ahead — not because they had become faster but because the women had become tired. The PCAMs, as Nicholas Metropolis put it in his oral history, did not get tired. They also did not go home at five. By the end of the test the IBM installation had completed the calculation. The women had stopped for the night. The machines kept running.2,6

The conclusion was not that the human computers were inadequate. The conclusion was that for sustained, repetitive arithmetic, machines could be run continuously and humans could not. Group T-5 continued operating through the war, but the strategic balance shifted. By V-E Day, the IBM facility was doing the bulk of the implosion calculations. By the time MANIAC I came online at Los Alamos in 1952, the human computing pool had been disbanded. The job title "computer," which had been used in formal Los Alamos personnel records to describe people, transferred — over roughly a decade between 1946 and 1956 — to the machines.2,3

Naomi Livesay and the bridge

The transition had a specific person at the center of it. Naomi Livesay arrived at Los Alamos in February 1944. She had a master's degree in mathematics from the University of Iowa, had taught secondary school, and — uniquely among the early Los Alamos staff — had taken formal IBM training in the operation of punched-card accounting machines. Hans Bethe, learning of her background, arranged her transfer into Frankel and Nelson's group. When the IBM equipment arrived in April 1944, Livesay was the only person at Los Alamos qualified to design and implement the plug-board programs that converted the abstract calculation flow into machine-executable operations. She was, in a precise sense, Los Alamos's first machine programmer — though the language of "programmer" did not yet exist, and the electromechanical PCAMs were not yet electronic computers.7

Livesay supervised the IBM PCAM operation for the rest of the war. The hand-computing pool continued under Flanders, doing smaller-scale and verification work; the punched-card facility under Livesay (and, later, Feynman during Nelson's hospitalization for a skiing accident in spring 1945) did the large-scale implosion simulations that fed directly into the Trinity test. Livesay's name appears in the Los Alamos technical reports of the period as "assistant scientist." It does not appear in most of the popular Manhattan Project histories. Her role in the human-to-machine transition was not recovered into the published historical record until the 2010s, primarily through the Los Alamos National Laboratory's own internal-history work culminating in LA-UR-21-20164 (Archer et al., 2021).7,8

The semantic shift

By the time ENIAC was demonstrated at the Moore School in February 1946, the linguistic situation was unstable. ENIAC was operated, primarily, by six women — Kathleen "Kay" McNulty, Frances "Betty" Snyder, Marlyn Wescoff, Ruth Lichterman, Betty Jean Jennings, and Frances Bilas — who had been hired as human computers at the Moore School during the war and were retrained to operate the new electronic machine. They were called computers in their personnel records. The machine they operated was also, increasingly, called computer. For a few years both meanings coexisted; in oral histories from the period, "computer" can mean the woman, the machine, or both, depending on context.9

The disambiguation moved through the language between roughly 1948 and 1955. By 1960 "computer" in American English overwhelmingly meant a machine. The women who had been computers were retitled as "operators," "programmers," "coders," or — increasingly — squeezed out of the field entirely as the work transitioned from war service to professional discipline and was reclassified as engineering. Most of the original ENIAC women were not invited to its formal dedication ceremony in February 1946; one of them, Jean Bartik, gave an oral history in the 1990s in which she remembered being mistaken for a hostess.9

The naming priority is therefore unambiguous, even if the historical-recovery work is still in progress. The word computer belonged to people for 333 years. It transferred to machines in approximately a decade. The transfer was completed, in American English, by about the time UNIVAC I was delivered to the U.S. Census Bureau in March 1951. Everything after that — the priority disputes between ENIAC, ABC, Colossus, and Z3 that occupy the previous installment in this series — is an argument about which machine first qualified for a name that already belonged to people who had been doing the job for centuries.

The continuity to Memphis

This bears directly on the present series. The Memphis facility now rented out to Anthropic is named Colossus, presumably after the Bletchley codebreaker, possibly also after the 1970 film. The orbital constellation it is intended to feed is being designed to host AI inference workloads at multi-gigawatt scale. What those workloads actually do is statistical computation over high-dimensional state spaces — the same general computational pattern that the T-5 women were running on Marchants in 1944, that Feynman's PCAMs ran in 1944-45, that MANIAC I ran in 1952, that the CDC 6600 ran in 1965, that the Cray-1 ran in 1976, that ASCI Red ran in 1997, that Roadrunner ran in 2008, and that the GPUs in the Memphis Colossus are running today.2,3

The substrate has changed by ten orders of magnitude in operations per second. The architecture of the work — decompose into arithmetic primitives, distribute across parallel processing elements, accumulate intermediate results, check for errors, integrate forward — has not changed at all. The Memphis Colossus is, in functional terms, Mary Frankel and Bea Langer and Mici Teller and Jean Bacher and Naomi Livesay, scaled up by a factor of ten billion and run continuously without rest.

This is the lineage Stephen's instinct picks out. The human computers were the original, the work was theirs, and the machines that took the name later — Colossus, ENIAC, UNIVAC, MANIAC, the rest — are downstream. Tommy Flowers's Colossus ran a statistical search across Lorenz wheel-setting hypotheses at electronic speed; the women of T-5 ran a statistical search across implosion-lens parameters at hand speed; the GPUs in Memphis run statistical searches across the parameter space of a transformer model at GPU speed. The continuity is not metaphorical. It is the same job, done by progressively faster instruments, all of which inherited their name from the people who did it first.

What this might mean for the orbital naming

The previous companion piece in this series argued for naming the orbital constellation after Tommy Flowers, or William Tutte, or one of the other Bletchley engineers. There is a stronger argument: name it after one of the human computers. Mary Frankel. Naomi Livesay. Henrietta Leavitt, whose period-luminosity relation is the reason we know how big the universe is. Annie Jump Cannon, who classified more stars than any other person in history. Katherine Johnson, whose trajectories got Apollo to the Moon. Mici Teller, who computed the implosion-lens hydrodynamics for the bomb that ended the Pacific war. Any of them would honor a longer and stronger lineage than the one running through D.F. Jones's 1966 novel.4,5,7

It is worth noting, finally, that the conventional history that gives priority to the machines also gives priority to the men who built them — Eckert, Mauchly, Atanasoff, Zuse, Flowers, Aiken, von Neumann. The list of men is real and their contributions are real. But the lineage they enter is older than they are, and it has different names attached. Brathwaite's 1613 sentence — "the truest computer of times" — refers to a person. The 333 years of computing history that followed it refer, almost without exception, to people who are not in the priority disputes. They were the work itself. The machines were the substrate that eventually replaced them. The claim that the women at Los Alamos who used their fingers might be the real first digital computers is correct on a more technical reading than it might first appear: they were the first computers in the literal sense the word had carried for three centuries, and the machines that succeeded them inherited the name without ever earning it from scratch.


Sources

  1. "Computer (occupation)." Wikipedia, retrieved 7 May 2026. https://en.wikipedia.org/wiki/Computer_(occupation)  |  Brathwaite, R. The Yong Mans Gleanings. London, 1613. OED first attestation of "computer" as an English-language noun denoting a person who computes.
  2. Atomic Heritage Foundation / Nuclear Museum. "The Human Computers of Los Alamos." https://ahf.nuclearmuseum.org/ahf/history/human-computers-los-alamos/  |  "Computing." https://ahf.nuclearmuseum.org/ranger/tour-stop/computing/   Includes oral histories from Jean Bacher, Kay Manley, Nicholas Metropolis, and Peter Lax.
  3. Howes, R. and Herzenberg, C.L. Their Day in the Sun: Women of the Manhattan Project. Temple University Press, 1999. Standard reference on the women of the Theoretical Division and the T-5 hand-computing pool.
  4. Royal Observatory Greenwich. "The post of Computer." https://www.royalobservatorygreenwich.org/articles.php?article=1000  |  Belteki, D. "Music and Murder under the Midnight Sky: the lives of the assistants and computers of the Royal Observatory at Greenwich." British Association For Local History, 25 November 2020.
  5. Shetterly, M.L. Hidden Figures: The American Dream and the Untold Story of the Black Women Mathematicians Who Helped Win the Space Race. William Morrow, 2016. The standard popular treatment of the NACA Langley West Computers including Katherine Johnson, Dorothy Vaughan, and Mary Jackson.
  6. Grier, D.A. When Computers Were Human. Princeton University Press, 2005. The definitive academic history of human computing from the Royal Observatory through the Manhattan Project. The phrase "the age of female computers" derives from Grier's framing.
  7. Archer, B., et al. "The Los Alamos Computing Facility During the Manhattan Project." Nuclear Technology, 2021. LA-UR-21-20164. https://www.tandfonline.com/doi/full/10.1080/00295450.2021.1940060   Includes detailed account of Naomi Livesay's role and Feynman's "race" between hand computers and PCAMs.
  8. Archer, B., et al. "Trinity by the Numbers: The Computing Effort that Made Trinity Possible." Nuclear Technology, 2021. https://www.tandfonline.com/doi/full/10.1080/00295450.2021.1938487
  9. Light, J.S. "When Computers Were Women." Technology and Culture, vol. 40, no. 3, July 1999, pp. 455-483. The foundational scholarly recovery of the ENIAC women — McNulty, Snyder, Wescoff, Lichterman, Jennings, Bilas — and the linguistic transition of "computer" from person to machine.

Aberdeen Comes First


The deeper U.S. computing lineage starts in Maryland, in 1918, with a private named Norbert Wiener calculating artillery trajectories by hand — and runs in an unbroken institutional line through the Bush Differential Analyzer in 1935, a backlog of firing tables that nearly broke the Ordnance Department by 1942, the U.S. Army contract that built ENIAC, and supercomputers dedicated in 2023 to the women who programmed it.

BLUF — 

Stephen's claim — that the U.S. Army was using computers at Aberdeen Proving Ground to generate ballistic firing tables before any other major American computational operation existed — is correct, and it describes the most direct line of institutional descent in the history of computing. The Ballistic Research Laboratory (BRL) at Aberdeen, formally established in 1938 with antecedents dating to World War I, was the first major American user of large-scale numerical computation; it was the customer that drove the Bush Differential Analyzer's 1935 installation; it was the customer that hired and trained an estimated 100+ women human computers between 1941 and 1945; and — most importantly — it was the customer that wrote and signed the Army Ordnance contract on 9 April 1943 that funded the construction of ENIAC at the Moore School. ENIAC was, from inception, an Aberdeen procurement. It was moved to Aberdeen in January 1947 in fulfillment of the original contract terms and ran there until 1955. The same site, now operating as the U.S. Army Combat Capabilities Development Command Army Research Laboratory (DEVCOM ARL), dedicated six new supercomputers in September 2023 named after the ENIAC Six — the Aberdeen-trained women who programmed ENIAC in 1945. The lineage is unbroken across 105 years and remains operational today.1,2,3

Wiener at Aberdeen, 1918

The story properly begins with a 24-year-old mathematician who had failed his Army physical, been rejected for officer training, and was eventually drafted as a private. Norbert Wiener — already a Harvard PhD, already a published philosopher, already on his way to becoming the founder of cybernetics — spent 1918 at Aberdeen Proving Ground calculating artillery firing tables by hand alongside other Ordnance Department mathematicians. The work was tedious. The arithmetic was unforgiving. The tables produced there were used by U.S. Army gunners on the Western Front to range their artillery against German positions in the final months of World War I.4

Aberdeen had been chosen as the U.S. Army's primary ordnance test facility in 1917 — selected over Sandy Hook, New Jersey, when the older proving ground became inadequate to the longer ranges and heavier shells of modern artillery. The 70,000-acre site on the Chesapeake Bay's western shore in Harford County, Maryland, gave the Army the over-water firing range it needed. From 1917 onward, every American artillery piece, mortar, and bomb of any consequence was tested at Aberdeen. The mathematics of getting the round to land where the gunner intended — exterior ballistics — was Aberdeen's defining intellectual problem.2

Wiener's 1918 service at Aberdeen had two lasting consequences. The first was personal: his friend and Aberdeen acquaintance Vannevar Bush, who would become director of the Office of Scientific Research and Development during World War II, would remember Wiener's ballistic-computation experience and assign him in 1940 to the antiaircraft fire-control project at MIT — the work that became foundational to control theory and to cybernetics. The second was institutional: Aberdeen, having seen what Wiener and his colleagues could do with pencil and paper in 1918, knew exactly what the limits of human computation were. When the demand for firing tables outran supply in the 1930s, Aberdeen knew it had a quantitative problem, not a qualitative one. It needed faster computers. The question was what kind.4

The firing-table problem, by the numbers

A firing table is a set of values — for a given gun, a given shell, a given propellant — that tells the gunner what elevation and azimuth to set for a desired range, accounting for ambient conditions. The variables are non-trivial. The shell's weight and shape determine its drag profile. The propellant charge determines its muzzle velocity. The barrel's wear and the gun's emplacement angle affect both. Air temperature, humidity, and density alter the drag coefficient. Wind speed and direction shift the trajectory laterally. The hardness of the ground under the gun affects recoil and therefore aim. Each combination of these variables produces a distinct trajectory. Each trajectory must be computed by integrating the equations of motion under drag — a coupled nonlinear ordinary differential equation system that has no closed-form solution.5,6
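The absence of a closed-form solution is why the work had to be numerical. A minimal sketch of the point-mass integration involved — with a simple quadratic-drag model and illustrative constants, not values from any actual firing table:

```python
# A minimal sketch of the exterior-ballistics problem the human computers
# solved by hand: a point-mass trajectory under gravity and velocity-
# dependent drag, stepped forward numerically because the coupled ODEs
# have no closed-form solution. All constants (drag coefficient, muzzle
# velocity, step size) are illustrative.
import math

def trajectory(v0, elev_deg, k=0.00008, g=9.81, dt=0.01):
    """Integrate dvx/dt = -k*v*vx, dvy/dt = -g - k*v*vy until the shell
    returns to ground level. Returns (range_m, time_of_flight_s)."""
    vx = v0 * math.cos(math.radians(elev_deg))
    vy = v0 * math.sin(math.radians(elev_deg))
    x = y = t = 0.0
    while True:
        v = math.hypot(vx, vy)      # current speed drives the drag term
        ax = -k * v * vx            # drag decelerates horizontally...
        ay = -g - k * v * vy        # ...and adds to gravity vertically
        vx += ax * dt
        vy += ay * dt
        x += vx * dt
        y += vy * dt
        t += dt
        if y <= 0.0 and t > dt:     # shell back at ground level
            return x, t

# One row of one firing table: range and time of flight for one
# elevation at one (hypothetical) muzzle velocity.
rng, tof = trajectory(v0=472.0, elev_deg=45.0)
```

Every additional variable the paragraph lists — propellant temperature, barrel wear, wind aloft — perturbs the inputs to this integration, and each perturbation meant another full table row computed from scratch.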

By the standard of the late 1930s, computing a single 60-second trajectory entry — one row in one table — required approximately 20 hours of work by a human computer using a Marchant or Frieden electromechanical desk calculator. A complete firing table for one gun-shell-propellant combination contained hundreds of entries. The U.S. Army, by 1942, had hundreds of distinct gun-shell combinations in service. Each new theater introduced new ammunition; each new ammunition required new tables. Each minor design change to a barrel or a shell required revisions. By the spring of 1942 the BRL was producing six firing tables per week and falling steadily behind a demand that ran into the tens of thousands of pages per year.3,6

This is what the textbooks mean by "the wartime computing crisis." It was not a crisis of theory or of physics. The exterior-ballistics equations had been understood since the early 19th century. It was a crisis of throughput. The Army had run out of arithmetic.

The Bush Differential Analyzer at Aberdeen, 1935

The first attempt at mechanization was analog rather than digital. Vannevar Bush had built a mechanical differential analyzer at MIT in the late 1920s — a room-sized arrangement of wheel-and-disk integrators connected by gears and shafts and driven by electric motors. The machine could solve differential equations of up to eighteen independent variables by setting up the equation as a physical analog: the rotation of one shaft was proportional to one variable, gear ratios performed multiplications, integrator wheels performed integrations, and the output appeared on a plotting table at the end of the chain.7
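The analyzer's operating principle — shafts as variables, gear ratios as multiplications, integrator wheels as integrals — can be imitated digitally. A toy sketch, assuming nothing beyond the paragraph above: the interconnection below solves y'' = −y, a standard analyzer demonstration, with each object standing in for one wheel-and-disk integrator.

```python
# Toy digital imitation of a differential analyzer: each Integrator plays
# one wheel-and-disk unit (its output shaft turns by the running integral
# of its input shaft). The real machine did this with gears, not floats.

class Integrator:
    def __init__(self, initial):
        self.value = initial        # angle of the output shaft

    def step(self, rate, dt):
        self.value += rate * dt     # wheel advances by rate * dt
        return self.value

def solve_harmonic(t_end, dt=0.001):
    """Interconnection for y'' = -y: the y-integrator is fed y', and the
    y'-integrator is fed -y, exactly as the gear train would be set up."""
    y  = Integrator(initial=0.0)    # y(0)  = 0
    yp = Integrator(initial=1.0)    # y'(0) = 1
    t = 0.0
    while t < t_end:
        rate = yp.value             # shaft carrying y'
        yp.step(-y.value, dt)       # y' accumulates -y
        y.step(rate, dt)            # y accumulates y'
        t += dt
    return y.value                  # approximates sin(t_end)
```

The accuracy limit the next paragraph mentions is visible even here: the answer is only as good as the step size, just as the real machine's answer was only as good as its gear tolerances.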

In December 1935, BRL took delivery of an Aberdeen-built copy of Bush's MIT machine. Officially designated the Bush Differential Analyzer, it could compute a 60-second trajectory in approximately 15 minutes — versus the 20 hours required by a human computer with a desk calculator. The improvement was approximately 80×. A second Bush machine, at the Moore School of Electrical Engineering at the University of Pennsylvania, was made available to BRL under a wartime cooperation arrangement, doubling capacity. By the late 1930s these two analog machines were the most advanced ballistic-computation hardware in the United States.6,7

It was not enough. The Bush analyzer was an analog machine — its accuracy was limited by mechanical precision, its programming required physical reconfiguration of gear trains, and its throughput, while orders of magnitude better than human computation, was still bounded. By 1941, BRL was using the analyzer at full duty cycle and falling further behind every month.

The hundred women

The remedy was hiring. The BRL training program for ballistic human computers, which had begun in modest form before the war, expanded sharply after Pearl Harbor. By 1942, the laboratory was actively recruiting women college graduates with mathematical training from across the Northeast — Bryn Mawr, Goucher, Hood, Chestnut Hill, Mount Holyoke, Smith — and putting them through accelerated training in ballistic computation. The figure most often cited is that BRL trained "almost 100" women in the period 1941-1943; Nathan Ensmenger's research suggests the actual peak was higher, with several hundred women working as Aberdeen-affiliated ballistic computers by mid-war when the Moore School satellite operation is included. From 1943 onward, "essentially all" of these computers were women, and so were their immediate supervisors.8

When the Women's Army Corps (WAC) was formed in May 1942, those WAC enlistees with mathematical training were trained in Philadelphia and deployed to Aberdeen specifically for ballistic computation. The BRL workforce grew from approximately 65 personnel in 1940 to a peak of approximately 730 by 1945 — a more than tenfold expansion driven primarily by the firing-table problem.2

The work was structured along the same lines that the previous installment in this series described for Los Alamos T-5: pre-printed computing forms with explicit cell references, decomposition of the trajectory integration into discrete arithmetic operations, parallel pipelines of computers each handling one operation, double-checking by the computer who produced each value, and a supervisor's verification at each stage transition. The architecture was not borrowed from Los Alamos; it had been refined at Aberdeen first, and Frankel, Nelson, and Flanders carried it westward when they joined the Manhattan Project in 1943. The BRL is the elder institution.3,8
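The forms-based discipline described above — one operation per desk, every value computed twice, a supervisor's check between stages — can be sketched as a toy pipeline. The stage operations below are illustrative stand-ins, not the actual worksheet arithmetic:

```python
# Toy sketch of the BRL hand-computing discipline: decompose a calculation
# into single operations, compute every cell twice independently, and let
# the supervisor compare the two results before the value moves on.

def checked_cell(op, value, tolerance=1e-9):
    """One worksheet cell: two independent computations of the same
    operation; the supervisor compares them at the stage transition."""
    first = op(value)       # first computer fills in the cell
    second = op(value)      # second computer repeats it on her own form
    if abs(first - second) > tolerance:
        raise ValueError("supervisor check failed: recompute this cell")
    return first

# A pipeline of single-operation stages, the way one integration step was
# split across several desks (operations here are arbitrary examples).
pipeline = [
    lambda v: v * v,        # one desk does only the squaring
    lambda v: v * 0.5,      # the next desk only the halving
    lambda v: v + 9.81,     # the last desk only the addition
]

def run_row(x):
    """Push one input through every desk in order, checking at each."""
    for op in pipeline:
        x = checked_cell(op, x)
    return x
```

With deterministic functions the check trivially passes; with human arithmetic it was the mechanism that caught transcription and carry errors before they propagated downstream.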

April 1943: the contract that bought ENIAC

The decisive document is a contract dated 9 April 1943, signed between the U.S. Army Ordnance Department, acting through BRL at Aberdeen, and the University of Pennsylvania's Moore School of Electrical Engineering. The contract authorized construction of the Electronic Numerical Integrator and Computer — ENIAC — at an initial budget of $61,700. The eventual cost would reach approximately $500,000 (roughly $8 million in 2025 dollars). The contracting officer was Lt. Col. Paul N. Gillon, BRL's officer-in-charge of computations, who had been hunting for a way to "revolutionize methods of calculations" since 1942 and had been listening to John Mauchly's pitch — for "an entirely new type of computational machinery" using vacuum tubes — since the previous fall. Mauchly's August 1942 memorandum, "The Use of High-Speed Vacuum Tube Devices for Calculating," circulated inside BRL before it reached the Moore School's senior leadership. Eckert and Mauchly are correctly credited with the engineering. Gillon and BRL are correctly credited with the procurement.1,9,10

The contract specified that ENIAC was being built to produce ballistic firing tables for the U.S. Army. That was the requirement. That was the entire requirement. The fact that ENIAC turned out to be a general-purpose programmable computer was a side effect, recognized partway through construction by Eckert, Mauchly, and the Moore School staff, and it was that recognition that opened the door for ENIAC's eventual use in Manhattan Project hydrogen-bomb calculations, atmospheric simulations, and the wider postwar scientific computing program. But the Army did not pay for a general-purpose computer. The Army paid for an electronic firing-table machine. Aberdeen's procurement language is remarkably clear on this point throughout the contract record.1,9

The ENIAC Six

In June 1945, the Army selected six of the best human computers from the BRL Aberdeen and Moore School computing pools for what was officially designated "Project X" — the codename ENIAC operated under during its remaining classified construction phase. The six were:

  • Kathleen "Kay" McNulty — Penn graduate, hired at Moore School in June 1942 at a salary of $1,620 per year. Later married John Mauchly. Continued working with ENIAC after its move to Aberdeen.
  • Frances "Betty" Snyder (later Holberton) — Aberdeen-trained, later one of the architects of the UNIVAC sort-merge generator and a contributor to the design of FORTRAN and COBOL.
  • Marlyn Wescoff (later Meltzer) — Temple graduate, hired in 1942.
  • Ruth Lichterman (later Teitelbaum) — Hunter College graduate. Moved to Aberdeen with ENIAC in 1947.
  • Betty Jean Jennings (later Bartik) — Northwest Missouri State graduate. Later led the conversion of ENIAC to a stored-program architecture in 1948.
  • Frances "Fran" Bilas (later Spence) — Penn graduate, hired in 1942 alongside Kay McNulty. Moved with ENIAC to Aberdeen.

All six were classified as "computers" in their personnel records — meaning, as the previous installment described, that they were people who performed mathematical calculation, the meaning the word had carried in English since 1613. They were promoted to "operators" of ENIAC and tasked with figuring out how to make the new machine actually compute, working from logic diagrams and wiring schematics. There was no programming manual, no programming language, no precedent. They invented programming as a discipline by direct examination of the hardware.3,10,11

ENIAC was formally dedicated at the Moore School on 15 February 1946. The U.S. Army press release described it as "the first all electronic general purpose computer ever developed" and noted that it had been built "at the request of the Ordnance Department to break a mathematical bottleneck in ballistic research." The six women were not invited to the dedication banquet. Several of them appeared in the demonstration photographs as "models" — the captioned identification given to women in 1940s engineering photography. The photographs show them at the panels of a working computer that they had programmed; the captions identified them by appearance, not by role. Recovery of their actual contribution into the historical record began with Kathy Kleiman's research in the 1990s and the documentary "The Computers" (2014). The U.S. Army formally honored them — posthumously, all six having died — at Aberdeen in September 2023.3,11,12

ENIAC moves home

The Moore School's research role on ENIAC ended with its dedication. Eckert and Mauchly resigned from Penn in March 1946 in a dispute with the university over patent rights — the same dispute that produced the 1947 ENIAC patent application that would eventually be invalidated by Judge Larson in 1973. Their new firm, incorporated as the Eckert-Mauchly Computer Corporation on 22 December 1947, would produce UNIVAC I in 1951.1,10

The Army, in keeping with the original 1943 contract, took possession of ENIAC and dismantled it for transport in November 1946. The machine was reassembled at the BRL at Aberdeen Proving Ground in early 1947, where it was operated continuously until being decommissioned on 2 October 1955. Three of the ENIAC Six — Kay McNulty, Ruth Lichterman, and Frances Bilas — moved to Aberdeen with the machine. Kay McNulty married John Mauchly in 1948 and left the field; Lichterman and Bilas continued operating ENIAC at Aberdeen for several more years.3,10

Once at Aberdeen, ENIAC's workload expanded well beyond its original ballistic-table mission. It ran the first numerical weather simulations under John von Neumann and Jule Charney; it performed the early hydrogen-bomb yield calculations that Edward Teller and Stanislaw Ulam needed; it produced cosmic-ray interaction simulations and Monte Carlo neutron-transport runs. Throughout, it remained — administratively, financially, and physically — a U.S. Army installation. The first general-purpose programmable electronic digital computer in the United States lived its operational life at Aberdeen Proving Ground.1,10

The 2023 dedication

On 28 September 2023, the U.S. Army Combat Capabilities Development Command Army Research Laboratory (DEVCOM ARL) at Aberdeen Proving Ground held a ribbon-cutting ceremony for new supercomputers installed at the DoD Supercomputing Resource Center. The five machines installed at that point — with a sixth following in 2024 — were named after the ENIAC Six: Bartik, Holberton, Antonelli, Meltzer, Teitelbaum, and Spence. The systems run modern Department of Defense scientific workloads — hypersonics, signature analysis, electromagnetic warfare, autonomous-systems modeling — at petascale performance levels that would, in absolute computational terms, complete the original BRL firing-table backlog of 1942 in a fraction of one millisecond.12,13
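The "fraction of one millisecond" claim survives a back-of-envelope check. Every number below is an assumed order-of-magnitude figure for illustration, not archival data:

```python
# Back-of-envelope check, with loudly assumed inputs: how long would one
# petaflop/s machine take to clear an assumed 1942 table backlog?
tables_backlog    = 40            # assumed: backlogged tables, spring 1942
entries_per_table = 500           # "hundreds of entries" per table
flops_per_entry   = 1_000_000     # assumed cost of one trajectory
                                  # integration in floating-point ops
petaflops         = 1e15          # one petaflop/s of sustained throughput

total_flops  = tables_backlog * entries_per_table * flops_per_entry
milliseconds = total_flops / petaflops * 1e3
# ~2e10 flops at 1e15 flops/s is a few hundredths of a millisecond.
```

Even if every assumed figure above is off by an order of magnitude in the pessimistic direction, the result stays under a millisecond.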

Cindy Bedell, director of DEVCOM ARL, opened the dedication with a sentence that should be quoted exactly as she said it: "There is no limit to what we can learn, except what our imagination tells us, and how we program those machines. So these ladies were the first programmers."12

The institutional continuity from 1918 to 2023 is therefore literal. The same site, under the same Army Ordnance lineage, has been doing computational work — at increasing technological levels, with the same fundamental requirement of arithmetic throughput at scale — for 105 years and counting. Norbert Wiener, calculating trajectories with a pencil and a logarithm table at Aberdeen in 1918. The Bush Differential Analyzer in 1935. The hundred women BRL trained on Marchants in 1942. ENIAC arriving in 1947 and running until 1955. The Cray-1 at BRL in the 1980s. The DEVCOM ARL supercomputers named for the ENIAC Six in 2023. Every successive substrate is faster than the one before it. The institution is the same. The work is the same.

What this changes about the previous pieces

Stephen's correction reframes the priority discussion in two ways. The first is straightforward: Aberdeen, not Los Alamos, is the older U.S. computing institution. The Manhattan Project's Group T-5 inherited the parallel-pipeline hand-computing architecture that BRL had refined from 1935 onward. Frankel and Nelson knew about it because Berkeley physics graduate students of their generation had Aberdeen contacts; Donald Flanders knew about it because the mathematics community talked. Los Alamos was a brilliant, urgent, and historically consequential application of the BRL methodology. It was not the source.3,8

The second is more interesting. ENIAC's claim to "first general-purpose programmable electronic digital computer" — the qualifier under which ENIAC survives the Larson ruling and the Colossus disclosure intact — is not an Eckert-Mauchly claim. It is an Aberdeen claim. The contract was Aberdeen's. The requirements were Aberdeen's. The computers who programmed it were Aberdeen-trained. The machine moved to Aberdeen at the end of its construction phase and operated there until decommissioning. Eckert and Mauchly built it; the U.S. Army Ordnance Department, headquartered at Aberdeen, owned it. The popular framing that gives Eckert and Mauchly individual priority in the 1973 patent dispute — and that gives Penn institutional priority in the academic computing histories — gets the architecture but loses the procurement. The Army should be in the textbook line, and largely is not.

The line to Memphis

Bringing this back to the larger arc of the series: the Memphis Colossus rented to Anthropic on 6 May 2026 is, in its institutional structure, a private-sector analog to BRL. Both exist because a customer with an essentially unbounded computational requirement needs faster substrate than the previous generation can deliver. BRL needed firing tables for an Army at war on six continents. Anthropic needs inference capacity for a frontier AI model whose user demand is growing 80× annualized. Both responses are the same: contract with the most capable engineering organization available, take delivery as quickly as possible, plan the next iteration before the current one is finished.

The differences are scale and urgency. The 1943 ENIAC contract was for $61,700, eventually $500,000. The 2026 Anthropic deal is reported in the high hundreds of millions of dollars for the Colossus 1 capacity alone, with multi-gigawatt orbital follow-on capacity discussed in the multi-billion range. The Aberdeen lineage produced six machines in 105 years; the Memphis-and-orbital lineage proposes one million orbital nodes in five. The substrate has scaled by ten orders of magnitude. The institutional logic — customer with a computational backlog, contract with the best available engineering, take delivery and demand more — has not changed at all.14

The honest conclusion of this five-piece series is that the Memphis Colossus inherits a longer and more institutionally specific lineage than its branding has suggested. It is the descendant not just of Tommy Flowers's Bletchley machine and Eckert and Mauchly's Penn machine but of Norbert Wiener at Aberdeen in 1918, of the hundred women BRL trained between 1941 and 1943, of Lt. Col. Paul Gillon's contract on 9 April 1943, of Kay McNulty and Frances Bilas riding the truck from Philadelphia to Aberdeen with ENIAC in November 1946, and of the DEVCOM ARL ribbon-cutting in September 2023. The orbital constellation that the present series began by analyzing — multi-gigawatt, sun-synchronous, optically meshed, named only as "SpaceX Orbital Data Center system" in the FCC filing — will, when it eventually goes operational, be doing exactly the same thing the BRL human computers were doing in 1942. Decompose the problem. Distribute the arithmetic. Accumulate the partial results. Check for errors. Integrate forward. The substrate is the variable. The work is the constant.


Sources

  1. "ENIAC." Encyclopedia of Greater Philadelphia. https://philadelphiaencyclopedia.org/essays/eniac/  |  Engineering and Technology History Wiki, "ENIAC." https://ethw.org/ENIAC
  2. "Ballistic Research Laboratory." Wikipedia, retrieved 7 May 2026. https://en.wikipedia.org/wiki/Ballistic_Research_Laboratory   Documents the BRL's establishment in 1938, growth from ~65 personnel in 1940 to ~730 by 1945, and the 1935 installation of the Bush Differential Analyzer.
  3. "Aberdeen Proving Ground dedicates supercomputers to early programmers." Baltimore Sun, 6 October 2023. https://www.baltimoresun.com/2023/10/06/aberdeen-proving-ground-dedicates-supercomputers-to-early-programmers/
  4. Conway, F. and Siegelman, J. Dark Hero of the Information Age: In Search of Norbert Wiener, the Father of Cybernetics. Basic Books, 2005. Definitive biography; covers Wiener's 1918 Aberdeen service in detail. See also Rheingold, H. Tools for Thought, MIT Press, 2000, chapter 5. http://www.rheingold.com/texts/tft/05.html
  5. Smiley, J. "Proving Ground: A biography and history of the six women who invented programming for ENIAC." EE Journal, 31 August 2023. Reviewing Kathy Kleiman's Proving Ground: The Untold Story of the Six Women Who Programmed the World's First Modern Computer, Grand Central, 2022. https://www.eejournal.com/article/proving-ground-a-biography-and-history-of-the-six-women-who-invented-programming-for-eniac/
  6. "The ENIAC Story." U.S. Army Research Laboratory, primary historical document. https://ftp.arl.army.mil/~mike/comphist/eniac-story.html
  7. DTIC ADA593830, "The Differential Analyzer." U.S. Army Ballistic Research Laboratory technical report on the Aberdeen installation of the Bush Differential Analyzer. https://archive.org/details/DTIC_ADA593830  |  Isaacson, W. The Innovators. Simon & Schuster, 2014, on the Bush analyzer's spread to BRL, Penn, Manchester, and Cambridge.
  8. Ensmenger, N. The Computer Boys Take Over: Computers, Programmers, and the Politics of Technical Expertise. MIT Press, 2010. Excerpt at "Women Were First Computer Programmers," Women's eNews. https://womensenews.org/2012/03/women-were-first-computer-programmers   Documents the BRL training program and the women supervisors who developed "plans of computation."
  9. U.S. Army Acquisition Support Center. "Then & Now." Documents Lt. Col. Paul N. Gillon's role and the Mauchly memorandum. https://asc.army.mil/web/news-then-now/
  10. "What Happened on April 9th." Computer History Museum / This Day in History. https://www.computerhistory.org/tdih/april/9
  11. Light, J.S. "When Computers Were Women." Technology and Culture, vol. 40, no. 3, July 1999, pp. 455-483.
  12. "Aberdeen Proving Ground dedicates supercomputers to early programmers." DEVCOM ARL release, 28 September 2023. AOL syndication carries Cindy Bedell's full remarks: https://www.aol.com/aberdeen-proving-ground-dedicates-supercomputers-230300907.html
  13. U.S. Department of Defense High Performance Computing Modernization Program. DEVCOM ARL DoD Supercomputing Resource Center installation records, 2023-2024.
  14. Anthropic. "Higher usage limits for Claude and a compute deal with SpaceX." 6 May 2026. https://www.anthropic.com/news/higher-limits-spacex

Time on Target


How Ballistic Tables and Radar-Triggered Shells Enabled the Deadliest Artillery of WW2

The Aberdeen firing tables weren't paperwork. They were the input parameters for the most lethal artillery tactic of the Second World War — a tactic that compressed the casualty-producing window of a barrage from minutes to three seconds, and that German prisoners of war captured at the Bulge described as the worst combat experience of their lives.

BLUF — 

Time on Target (TOT) is the artillery technique in which multiple firing batteries, each at different ranges and angles relative to a single target, time their firing so that all rounds arrive at the impact point within roughly three seconds of one another. The technique was developed by the British Eighth Army in the North African campaign in 1942, refined by Lt. Gen. Brian Horrocks and the Army Group Royal Artillery (AGRA) brigade structure, adapted by U.S. Army Maj. Gen. J. Lawton Collins on Guadalcanal in 1943, and standardized in U.S. doctrine by the Normandy campaign of 1944. Combined with the proximity-fuzed (VT) shell — a Johns Hopkins APL development first issued for ground use in December 1944 — TOT produced a casualty mechanism that battle-hardened German veterans had never previously experienced. At the Battle of the Bulge (16 December 1944 – 25 January 1945), American artillery employing TOT plus VT was widely credited with stopping the German offensive in the Ardennes. The technique was mathematically impossible without accurate, current firing tables for every gun-shell-propellant combination in service. Those tables were produced at the Ballistic Research Laboratory at Aberdeen Proving Ground by the women human computers and the Bush Differential Analyzer described in the previous installment of this series. Stephen's framing — that TOT was much more deadly than classical artillery and that the firing tables enabled it — is correct. The Aberdeen computing operation was, in operational terms, a casualty-producing weapon system whose substrate happened to be pencils, Marchants, and a 1935 analog computer.1,2,3

Why a barrage's first three seconds matter

The technical observation behind TOT is straightforward and grim. When artillery rounds begin landing on a target, troops in the open take cover within seconds — into foxholes, behind walls, under vehicles. After the first few rounds the marginal casualty rate per round drops by an order of magnitude. A sustained barrage that produces 100 casualties in its first ten seconds may produce only 20 in its second ten and another 20 in its third. The classical World War I solution — fire continuously for hours, at maximum sustainable rate — was militarily wasteful in the sense that the great majority of expended ammunition arrived after the targets had taken cover.1

The British Royal Artillery, studying the November 1918 V Corps Artillery bombardments and the experience of the Western Desert, reached the operational conclusion that what mattered was front-loading. If you could deliver, in the opening seconds, the firepower that conventional doctrine spread over a full barrage, the casualty effect would be vastly higher and the ammunition consumed vastly lower. The technical question was: how do you make rounds from multiple batteries, scattered across miles of terrain, arrive at the same target at the same instant?

The mathematics, briefly

Consider a TOT shoot with eight batteries — say, four 105 mm howitzer batteries and four 155 mm howitzer batteries — distributed over an arc of roughly fifteen kilometers, all firing on a single target two to nine kilometers from each gun. Each battery has a different range to the target, a different elevation, a different propellant charge selected to produce the necessary range, and therefore a different time of flight. The 105 mm closest to the target might have a time of flight of 18 seconds; the 155 mm at maximum range might have 42 seconds. To deliver all rounds within a three-second impact window — the doctrinal standard — every firing battery must compute its own time of flight from the proposed time of impact and fire its rounds accordingly.2,4

The fire direction center (FDC) directing the shoot broadcasts a countdown. Each battery, in the seconds before its assigned firing moment, has computed: my time of flight is T seconds; I fire at (Time-on-Target minus T). The countdown reaches zero. All rounds land within the standard plus-or-minus three seconds. The first three seconds of impact contain the entire firepower that a conventional barrage would have spread over thirty seconds or three minutes. The casualty effect, against troops in the open or moving between positions, is correspondingly multiplied.
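The scheduling arithmetic in the two paragraphs above is small enough to write down. A minimal sketch, with illustrative times of flight rather than real firing-table values:

```python
# TOT scheduling: each battery fires at (time on target minus its own
# time of flight), so every round arrives at the same instant on the
# FDC's common countdown clock. Times of flight here are illustrative.

def fire_times(time_on_target, times_of_flight):
    """Return each battery's firing time on the shared countdown."""
    return [time_on_target - tof for tof in times_of_flight]

# Eight batteries, times of flight in seconds; impact scheduled at t = 60.
tofs  = [18.0, 22.5, 27.0, 31.5, 33.0, 36.5, 40.0, 42.0]
fires = fire_times(60.0, tofs)

impacts = [fire + tof for fire, tof in zip(fires, tofs)]
window  = max(impacts) - min(impacts)   # zero in the ideal case;
                                        # doctrine allowed ~3 s of spread

# A two-second firing-table error in one battery's time of flight puts
# its round at t = 62.0 — outside a three-second window on its own:
bad_impact = fires[-1] + (tofs[-1] + 2.0)
```

The last two lines are the computational point of the section: the whole tactic rides on the accuracy of the time-of-flight column in the firing table.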

This is impossible without accurate firing tables. The time of flight for a given gun-shell combination depends on muzzle velocity (which depends on propellant charge, propellant temperature, and barrel wear), atmospheric conditions (temperature, pressure, humidity, wind aloft), and projectile drag profile. A firing-table error of two seconds in time-of-flight translates to a battery's rounds arriving outside the three-second TOT window — meaning, operationally, that the rounds land after the enemy has taken cover and produce a fraction of the intended casualties. Time on Target is, in computational terms, a precision-timing problem against a moving target (the enemy's reaction time) where the precision is measured in seconds and the inputs are the multi-variable trajectory equations whose computation was the entire pre-war business of the Ballistic Research Laboratory.

The British origin: North Africa, 1942

The TOT technique's first systematic use is generally credited to the British Eighth Army in the Western Desert, 1942. Operation Bumper in 1941 — the large UK anti-invasion exercise organized by Gen. Alan Brooke with Lt. Gen. Bernard Montgomery as chief umpire — had developed the Army Group Royal Artillery (AGRA) brigade concept: a powerful artillery formation of three or four medium regiments and one heavy regiment, capable of being moved rapidly across the battlefield and delivering concentrated counter-battery and target fires. The AGRAs were the first formations in any army with the centralized fire-control structure necessary to coordinate dozens of guns from multiple regiments on a single target.6

By the time Montgomery took command of Eighth Army in August 1942, the Royal Artillery had developed the procedural and computational framework to execute TOT against German Afrika Korps positions. The technique was used at Second El Alamein (October-November 1942) and refined through the Tunisian campaign and the Italian invasion. By the Normandy landings in June 1944, the British 8 AGRA — formed in May 1943 — was operating TOT as standard doctrine across multiple regiments coordinating on the same target.6

The American adoption: Guadalcanal, 1943

The American transition to TOT happened earlier than is generally remembered. Maj. Gen. J. Lawton Collins commanded the 25th Infantry Division on Guadalcanal in late 1942 and 1943, in operations against entrenched Japanese positions. The 25th's artillery was among the first major American formations to systematically employ Fire Direction Centers (FDCs) using massed fires with TOT timing. Collins was selected to command U.S. VII Corps for Operation Overlord on the basis, in part, of this artillery work. From Normandy through the breakout (Operation Cobra, July 1944), through Aachen (October 1944), through the Hürtgen Forest, and into the Ardennes counter-offensive of December 1944 - January 1945, Collins's VII Corps employed TOT with increasing scale and effect.4

The largest single-target TOT shoot of the war is generally credited to VII Corps on 21 November 1944 during the attack on Hill 187 in the Hürtgen Forest sector: 20 artillery battalions delivered a coordinated three-minute concentration on a single key German observation post. The mathematical coordination — twenty separate FDCs, each commanding a battalion of guns at a different range and direction, all timing their fires so the rounds converged at the same place within seconds — was, for its era, an extreme demonstration of distributed real-time computation. The infrastructure that made it possible included, indispensably, the firing tables produced at Aberdeen.2,4

The Bulge

The Battle of the Bulge — known to the U.S. Army as the Ardennes Counteroffensive — was the largest single battle fought by the United States in World War II. From 16 December 1944 to 25 January 1945, the Germans committed approximately 410,000 men, 1,400 tanks and assault guns, 2,600 artillery pieces, and 1,000 aircraft to a surprise attack through the Ardennes Forest aimed at splitting the Allied front and capturing the port of Antwerp. The American defense produced 89,000 U.S. casualties (including approximately 19,000 killed) against German losses estimated between 67,000 and 100,000. The battle effectively destroyed Germany's strategic reserve and shortened the European war by months.7,8

Allied superiority in artillery — in raw tube count, in ammunition supply, and in tactical employment — was a primary factor in stopping the German offensive. The combination of TOT timing, VT-fuzed air-burst rounds (authorized for ground use only days before the German attack), and the integrated FDC structure that the U.S. Army had refined since Guadalcanal produced a casualty mechanism that German formations had not previously encountered. American post-action reports and subsequent prisoner interrogations consistently noted German confusion about the suddenness and lethality of American artillery — German troops, accustomed to the sequenced barrages of conventional artillery doctrine, had no procedural answer to a fire mission that arrived in three seconds and produced air-burst fragmentation over their concealed positions.1,9

The historian Charles MacDonald, who served as a rifle-company commander in the Bulge, described the experience from the receiving end at the company-grade level — but the more telling testimony is from German veterans interviewed postwar, who routinely identified TOT-with-VT as the artillery experience they remembered most vividly from a war that had included the Eastern Front. German artillery tradition had emphasized accuracy and observed fire; American doctrine, by 1944, had moved beyond observation to mathematical fire control of a kind that German doctrine had no analog for. The reason was not greater tactical creativity. It was greater computational throughput, enabling firing tables of greater accuracy and breadth than the Wehrmacht's own artillery service could produce.1,9

The Aberdeen connection: the actual operational chain

The previous installment in this series argued that the BRL's wartime firing-table production was the institutional driver behind the ENIAC procurement. That argument was about computing-history priority. This article makes the harder argument: those firing tables were not administrative support to the artillery service. They were the artillery service's primary weapon, in the same sense that the proximity fuze and the 155 mm round were weapons. Without the BRL's tables, every TOT shoot reduced to a guess. With them, every TOT shoot was a precision-timed, multi-axis casualty-production system whose output was measured in enemy killed per minute of mission time.

The chain runs as follows. A forward observer identifies a target — say, a German infantry company forming up behind a wood line. He calls the fire mission to his battalion FDC. The FDC, depending on the target's importance and the situation, requests a TOT shoot from regiment or division artillery, which assigns multiple battalions to the mission. Each participating battery's FDC computes, for its own gun-shell-propellant-fuze combination and the local atmospherics, the time of flight to the target. That computation uses the firing table for that battery's specific equipment. The firing table came from BRL. The countdown is broadcast. The rounds arrive within three seconds of one another. The German company, in the open and just beginning to react to the ranging round, takes its casualties before it can take cover.2

The casualty chain ends at the German company. The computational chain begins at a desk at Aberdeen, where a BRL human computer — one of the hundred or more women trained between 1941 and 1943, possibly a WAC enlistee transferred from Philadelphia, possibly a Bryn Mawr or Goucher graduate hired into civilian Ordnance Department service — sits at a Marchant calculator working through the trajectory integration for the gun in question, on a pre-printed form, double-checking each cell, passing intermediate results to her colleague at the next desk for the next operation. Months later, the resulting firing table reaches the European Theater of Operations through Ordnance distribution. The artillery battalions print it into their fire-control manuals. The fire direction officers reference it during the TOT shoot. The German company is destroyed.
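What that human computer was doing, cell by cell, was numerical integration of the equations of motion. The sketch below steps a point-mass trajectory with a simple quadratic-drag model until the shell returns to ground level; the muzzle velocity and drag constant are assumed round numbers, not values from any actual firing table, and each (elevation, time-of-flight) pair corresponds to one cell of a table:

```python
import math

def time_of_flight(v0, elev_deg, k=5e-5, dt=0.01, g=9.81):
    """Integrate a point-mass trajectory with quadratic drag (coefficient k);
    return (time of flight in seconds, ground range in meters).
    v0 and k are illustrative assumptions, not real ballistic data."""
    theta = math.radians(elev_deg)
    vx, vy = v0 * math.cos(theta), v0 * math.sin(theta)
    x = y = t = 0.0
    while True:
        v = math.hypot(vx, vy)
        ax = -k * v * vx          # drag opposes the velocity vector
        ay = -g - k * v * vy      # gravity plus the vertical drag component
        vx += ax * dt
        vy += ay * dt
        x += vx * dt
        y += vy * dt
        t += dt
        if y <= 0.0 and t > dt:   # shell back at ground level: done
            return t, x

# Four cells of a hypothetical table for an assumed 472 m/s charge.
for elev in (20, 30, 45, 60):
    tof, rng = time_of_flight(472.0, elev)
    print(f"elevation {elev:2d} deg: time of flight {tof:5.1f} s, range {rng/1000:4.1f} km")
```

The BRL's actual methods — Siacci approximations, then full numerical integration on desk calculators and the Differential Analyzer — were far more elaborate than this Euler stepping, but the shape of the work is the same: thousands of small arithmetic steps per cell, and hundreds of cells per table.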

This is the operational meaning of "the Aberdeen computing crisis of 1942." When BRL was producing six firing tables a week and falling behind, what was actually falling behind was the U.S. Army's capacity to execute TOT against the full range of guns, ammunition, and atmospheric conditions it would face in the European and Pacific Theaters. The pressure that produced the ENIAC contract on 9 April 1943 was the pressure of an artillery service whose tactical doctrine had outrun the computational substrate available to support it. The Wehrmacht did not have this problem because the Wehrmacht had not committed to TOT-style fire-control doctrine. The U.S. Army did. The BRL's firing-table operation was the part of the U.S. defense industrial base that turned the doctrinal commitment into tactical reality.

The OODA loop, fifty years before Boyd named it

Stephen's larger framing — that the firing tables made TOT possible, and TOT was much more deadly than classical artillery — connects to a concept that John Boyd, Air Force fighter pilot and tactician, would not formalize until the 1970s: the OODA loop. Observe, Orient, Decide, Act. Boyd's argument was that the side that could compress the OODA loop — that could complete the cycle from observation to action faster than its adversary — would win regardless of equipment parity. TOT is, in Boyd's terms, an OODA-loop compression. The FDC observes the target, orients on the firing solution, decides on the TOT shoot, and acts — all faster than the enemy's reaction time, which is bounded by the time it takes troops to take cover after the first round lands. The enemy's loop is biological. The American loop is mathematical. The mathematical loop wins because it can be made arbitrarily faster as the underlying computation gets faster, while the biological loop is fixed at roughly three seconds.10

This is, recognizably, the same logic that drove the postwar U.S. defense electronics industry. Faster computers produce faster firing solutions, faster radar tracks, faster missile guidance updates, faster electronic warfare responses. The 1990s and 2000s American military advantage — what defense analysts called "the second offset" and "network-centric warfare" — was, technically, an extension of the same OODA-compression logic that Aberdeen's firing tables had introduced in 1942. Compute the firing solution faster than the adversary can react. The substrate matters only insofar as it determines how fast you can compute.

Back to Memphis

This is, finally, where the present series's framing of the Anthropic-SpaceX deal earns its full structure. The Memphis Colossus and the proposed orbital constellation are, in functional terms, the latest step in a continuous lineage of OODA-loop-compression substrates that began with Wiener at Aberdeen in 1918 and proceeded through the BRL human computers, the Bush Differential Analyzer, ENIAC, the proximity fuze, the Whirlwind that became the SAGE air-defense computer, the IBM 360, the Cray-1, the ASCI Red supercomputer, and the GPU clusters of the 2020s. Every one of these substrates was driven by a customer — the U.S. Army Ordnance Department, the U.S. Air Force, the National Security Agency, the Department of Energy nuclear-weapons design laboratories — that needed faster computation than the previous generation could deliver, and that contracted with the best available engineering organization to get it.

Anthropic is a non-traditional customer in this lineage. It is a private-sector frontier-AI lab, not a defense agency. Its computational requirement comes from a transformer model whose user demand is growing 80× annualized rather than from an artillery service that needs to fire fifteen hundred TOT shoots a week. But the institutional logic — find a customer with an unbounded computational requirement, contract with the best available engineering organization, take delivery as quickly as possible, plan the next iteration — is identical to the one BRL operated under in 1942. The substrate has scaled by ten orders of magnitude. The OODA-compression logic has not changed at all.

Stephen's correction across this series — Aberdeen first, Los Alamos second; firing tables enabled TOT; TOT was a casualty-producing weapon system — also points at a sharper conclusion than I initially appreciated. The orbital constellation that SpaceX and Anthropic are jointly contemplating is not, in the long view, a new kind of thing. It is a high-throughput substrate for compressing the loop between observation and action, paid for by a customer whose application requires that loop to be compressed faster than any current substrate allows. The customer is private. The application is consumer-facing AI. The institutional structure that connects the customer's requirement to the engineering organization's delivery is a contract dated 6 May 2026, with a memorandum of intent for a follow-on capacity in low Earth orbit.

The contract dated 9 April 1943, between BRL Aberdeen and the Moore School of Electrical Engineering at the University of Pennsylvania, was for $61,700 and produced ENIAC. Through ENIAC and the women who programmed it, that contract produced the firing tables that fed the TOT shoots that broke the Wehrmacht in the Ardennes. Through ENIAC's postwar progeny it produced numerical weather prediction, hydrogen-bomb design, and the entire scientific computing tradition that runs through Cray and IBM and CDC to the present. The Anthropic-SpaceX deal is operating in the same lineage. Whether its long-term effects will compare to the 9 April 1943 contract's, history will eventually decide. The institutional pattern, however — the customer whose requirement outruns every available substrate, the engineer with the substrate to deliver, the women (and now the GPUs) doing the actual arithmetic — is the same pattern that has run the American defense computing industry for 108 years and counting.


Sources

  1. "U.S. and German Field Artillery in World War II: A Comparison." The Army Historical Foundation. https://armyhistory.org/u-s-and-german-field-artillery-in-world-war-ii-a-comparison/   Documents American FDC structure, TOT employment, and the contrast with German artillery doctrine.
  2. "Time on Target." Historical Marker Database, marker erected by U.S. Army War College / United States Army Heritage and Education Center / Army Heritage Center Foundation. https://www.hmdb.org/m.asp?m=123245  |  "Time On Target." Modern Operations summary. https://www.liquisearch.com/artillery/modern_operations/time_on_target
  3. Baldwin, R.B. The Deadly Fuze: The Secret Weapon of World War II. Presidio Press, 1980. The standard history of the VT proximity fuze, including its 1944 ground-use authorization timing relative to the Ardennes counteroffensive.
  4. Bondurant, M.B. "Evolution of Artillery Tactics in General J. Lawton Collins' US VII Corps." U.S. Army Command and General Staff College, monograph, 1996. https://apps.dtic.mil/sti/tr/pdf/ADA312682.pdf   Documents Collins's Guadalcanal TOT employment and its evolution through the VII Corps's European campaigns.
  5. "Artillery in the Battle of the Bulge: Devastating Power and Impact." The Battle of the Bulge historical resource site. https://the-battle-of-the-bulge.com/blogs/artillery-battle-of-the-bulge
  6. "8th Army Group Royal Artillery." Wikipedia. https://en.wikipedia.org/wiki/8th_Army_Group_Royal_Artillery   Documents Operation Bumper (1941), AGRA development, and British Royal Artillery doctrine in North Africa, Italy, and Northwest Europe.
  7. "Battle of the Bulge." Wikipedia, retrieved 7 May 2026. https://en.wikipedia.org/wiki/Battle_of_the_Bulge   Casualty figures (Allied 77,000-83,000+; German 63,000-104,000) and the German order of battle.
  8. National WWII Museum, "The Battle of the Bulge." https://www.nationalww2museum.org/war/articles/battle-of-the-bulge
  9. MacDonald, C.B. A Time for Trumpets: The Untold Story of the Battle of the Bulge. William Morrow, 1985. MacDonald commanded a rifle company at the Bulge; the book is the standard popular history written by a participant.
  10. Boyd, J.R. "Patterns of Conflict." Briefing developed at the U.S. Marine Corps Command and Staff College, 1976-1986; the OODA loop framework. Boyd never published in book form; his briefing slides are archived at the Marine Corps Research Center, Quantico, and have been reproduced in: Coram, R., Boyd: The Fighter Pilot Who Changed the Art of War. Little, Brown, 2002.
