Designing Certainty: The Rise of Algorithmic Computing in an Age of Anxiety, 1920–1970
This dissertation offers a political history of the cultural trope and technical apparatus of ‘with 95% certainty,’ and of uncertainty more broadly, from the mathematical statistics movement of the early 1920s through the design of digital algorithms in FORTRAN and ALGOL in the 1960s and 1970s. The work features a prominent twentieth-century data architecture: the confidence interval (CI). Confidence intervals are statistical estimation devices, bound up with hypothesis tests and experimental design mechanisms, used to draw inferences from data and to inform subsequent decision-making based on that information and analysis. CIs connect digital and predigital computing and function as part of the underpinning logical and political infrastructures that make algorithmic decision-making possible. I situate digital algorithms and statistical hypothesis tests as common ‘data architectures’ that operate under uncertainty (probabilistic thinking) and that are designed to make certainty claims (political decisions) based on a set of information. By the 1960s, digital algorithms were designed to take over the (un)certainty work of human computers.
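To make the ‘95% certainty’ trope concrete for readers outside statistics, a minimal textbook-style sketch (my own gloss, not a formulation drawn from the dissertation’s sources) of the canonical Neyman-style confidence interval for a population mean, assuming a large sample with sample mean $\bar{x}$, sample standard deviation $s$, and sample size $n$:
\[
\bar{x} \;\pm\; z_{0.975}\,\frac{s}{\sqrt{n}}, \qquad z_{0.975} \approx 1.96 .
\]
On this standard procedural reading, the 95% attaches to the long-run coverage of the interval-constructing procedure under repeated sampling, not to any single interval; the translation of that probabilistic coverage into a singular certainty claim is the kind of move the dissertation’s notion of (un)certainty work names.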
At the scale of experimental data design, key computing concepts are at work: confidence (a measure of validity), control (randomization), and uncertainty (probability limits). These concepts hold technical-mathematical meanings, and I argue that they also hold affective meanings, driven by human desires and anxieties. I link historical instances and applications of CI logics, a practice that I term ‘confidence computing,’ with much larger historical forces in agriculture, militarism, and environmental policy. I follow iterations of CI logics across a hundred-year period and across global applications in Poland, India, England, the United States, and Navajo and Hopi land. I put forward two analytic categories to connect these contexts: ‘(un)certainty work,’ the twofold process of translating data into probabilistic information and analysis and then making certainty claims based on that information and analysis; and ‘computing landscapes,’ the geographical areas of land, and the political and cultural contexts, that are altered and transformed through this computing work.
I argue that between 1920 and 1970 an information transformation occurred that reconfigured economic, scientific, and environmental planning processes under a shared program to command uncertainty in data management. This information movement is driven by iterations of crisis that begin in the aftermath of WWI. Designations of crisis are generative of new technical (un)certainty designs and new information systems, just as they reaffirm extant information and power structures. Waves of crisis and responsive computational design (and redesign) therefore give impetus to an expanding power of (un)certainty work and oversight across the twentieth century. Along this trajectory, confidence interval logics morph from handwritten statistical information on graph paper, through punch-card ballistics analysis, to coded inputs in digital system processing.
The chapters of this dissertation, on crisis, confidence, control, (un)certainty, and climate, are defined by war and crisis. The story begins in the aftermath of WWI, in the context of a growing agricultural industrialism, expanding western capitalism, and drought management. In the lead-up to WWII, the rising aerial bombing economy severs computational logics from their agrarian roots and assumes a vantage point from 10,000 feet, “bombsight optics.” In the aftermath of WWII, the U.S. war in Korea and the subsequent proxy wars become vectors for the expansion of (un)certainty work, originating in the firestorm bombing of North African beaches. Throughout the Cold War period, weather control programs, built with confidence logics, generate a new aerial-agricultural economy to be taken over by the management of automated decision-making systems. Designing Certainty ends where the story begins, with farm management. But this is now an agricultural economy that has incorporated the colonial and aerial perspectives emergent from decades of war.
Designing Certainty features the archives and work of the Polish logician and statistician Jerzy Spława-Neyman, the confidence interval’s initial designer. I move away from a male-figurehead genealogy and do not cast Neyman as the primary agent or “father” of CI logics. Rather, this is a history of the world he lived in, of the many actors, influences, and historical contingencies that contributed to the rise of (un)certainty computing as a dominant epistemological and political force. My research on CI logics draws on more than 20 archives and special collections, and on technical and cultural materials spanning a century-long period.