As a kid growing up in the 1980s, I was frighteningly aware of the possibility of a nuclear war with the Soviets. While films like WarGames and The Day After kept an older generation on edge, I was a bit too young for those.
Nevertheless, one evening I saw a news report about a “nuclear exercise” that showed American troops running for cover. The terror set in. It took what seemed like hours for my parents to calm me down. Those childhood fears weren’t without reason. Nuclear near-misses like the “Able Archer” incident in November of 1983 almost led to a Soviet preemptive nuclear strike on the US. Thankfully, those worst fears never came to fruition, and we’ve lived to tell the tale.
The end of the Cold War, and of the Soviet nuclear threat, came about for clear and compelling reasons. The most important is the irrationality of socialism itself. Certainly, the courageous acts of Eastern Europeans who resisted their oppressors played a decisive role, along with moral and material support from the West. At the same time, there were subtle and unsung heroes whose actions were in the pursuit of profit and innovation. Seymour Cray, “the father of supercomputing,” was just such a man.
Known as an “eccentric maverick,” Cray served in the Pacific theater as a code-breaker during World War II. In 1957, he helped found Control Data Corporation in Minneapolis, Minnesota. It was partly his personal fears about nuclear war that led him to move the company's laboratory to his small hometown of Chippewa Falls, Wisconsin. After this move, however, management conflicts emerged, and he later broke away and formed Cray Research. It was there that his computing principles — simplicity, size, discipline, and cooling — birthed the world's first supercomputer.

This drive for innovation led him and his team to use integrated circuits to achieve the fastest computing speeds in the world. But this speed came with a tradeoff: extreme heat. In fact, the machines ran so hot that legend has it they were used to heat the facilities during freezing Midwest winters. Once the cooling issue was solved and production began, the first machine, the Cray-1, was “lent” to the Los Alamos National Laboratory in March of 1976. After its successful use there (simulating the impacts of nuclear blasts), it was sold to the National Center for Atmospheric Research in 1977 for $8.8 million, more than $50 million in today's terms.
Cray’s team had expected to sell about a dozen machines, but demand was higher than anticipated. By the time the Cray-2 was developed, they had sold more than 80, a smashing business success if there ever was one!
Alongside the venture's tremendous commercial success, Cray's creative genius can't be overstated. When asked what sort of CAD program he used to design the Cray-1, he replied, “#3 pencils with quadrille pads.”
Cray's commitment to simplicity and intuitive design foreshadowed the importance of miniaturization, a cornerstone of modern computing's success, portability, and mass production.
In 1965, Gordon Moore observed that the number of transistors on an integrated circuit had been doubling roughly every year, a pace he later revised to about every two years. This observation gave rise to what's known as “Moore's Law.” Simply put, the more transistors, the greater the computing power. Cray harnessed this to great effect with his supercomputer. The success of his machine was largely due to its lightning-quick peak speed of 160 million floating-point operations per second (160 MFLOPS) and its memory of about 8 megabytes (one million 64-bit words). By way of comparison, an average laptop from 2021 ran up to 2,671 times faster!
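As a rough sanity check on those figures, here is a back-of-the-envelope sketch. The numbers are the ones quoted above (the 160 MFLOPS peak and the 2,671x ratio); nothing here comes from Cray's own documentation:

```python
# Back-of-the-envelope arithmetic for the comparison above. The Cray-1
# figure (160 MFLOPS) and the 2,671x ratio come from the text; the
# implied laptop rate is simply their product.
cray1_flops = 160e6                    # Cray-1 peak: 160 MFLOPS
speedup = 2671                         # ratio quoted for a 2021 laptop
laptop_flops = cray1_flops * speedup   # ~4.27e11 FLOPS
print(f"Implied 2021 laptop rate: {laptop_flops / 1e9:.0f} GFLOPS")

# Moore's-law framing: at one doubling every two years, transistor
# counts between 1976 and 2021 would double about 22.5 times -- a far
# larger factor than 2,671, a reminder that transistor counts and
# single-machine speed comparisons grow at different rates.
doublings = (2021 - 1976) / 2
print(f"{doublings:.1f} doublings -> factor of {2 ** doublings:,.0f}")
```

The gap between the two factors illustrates why Moore's Law, which counts transistors, is only a loose proxy for delivered floating-point performance.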
This kind of success in business and innovation had untold cultural effects, not just for how we live and work, but for the de-escalation of the threat of nuclear war at the end of the twentieth century.
While it might be a stretch to claim that the first Cray-1 directly led to fewer US nuclear tests, its use coincided with a steady decline in testing at Los Alamos: the weapons testing program cut back from roughly twenty explosions per year to fourteen by 1980. Reduced testing is known to have eased tensions between the superpowers, since escalating weapons tests signal a threat to adversaries, just as they do today. Despite that four-year trend, testing then mildly accelerated under the Reagan administration. Indeed, some credit this nuclear saber-rattling, a rope-a-dope tactic of sorts, with the collapse of the USSR. Others, including Soviet insiders, doubt that hypothesis, seeing the weakness of the Soviet economy as the system's fatal flaw. Whatever the ultimate reasons, the world breathed a sigh of relief as nuclear tensions fell dramatically once Soviet socialism, and its nuclear threat, was put to rest.
With the end of the Cold War, the 1990s saw a new wave of freedom sweep the Eastern bloc, unleashing dramatic, if uneven, economic growth and liberalization. A key part of the recipe has been investment in computing power, not just in the former socialist states, but all across the globe. As computing speeds have risen, worker productivity has climbed, and at an ever-decreasing cost. Advances in computing power have made households and firms alike more productive and connected than ever and spurred the creation of a digitized culture.

For better or worse, this fundamental shift in geopolitics, in the nuclear threat, and in the way we live, work, and move has been driven by the exponential growth in computing power. In light of these benefits and their tradeoffs, Seymour Cray and his entrepreneurial judgment and risk-taking shaped our world in ways that few would have expected. And at least for this Cold War kid, it allows my kids to sleep a little better at night than I did, unless, of course, they're awake and using their handheld supercomputers.
Whenever people hear of the Los Alamos National Lab, they think of J. Robert Oppenheimer, the Manhattan Project, further atomic weapons testing, and broader scientific and computing experimentation, all brought to you by the US federal government. However, the first supercomputer was conceived and executed by a private firm, Cray Research, Incorporated, and its founder, Seymour Cray.
