American Futurism in the Atomic Era

Democratization of the Sciences: The Changing Rhetoric of Science

      One aspect of the Cold War that made its visions of the future unique involves the perception of science. Marketed examples of futurism in the United States increased in prevalence during the Cold War years. While the topic of futurist imaginings receives significant attention in public history spaces due to the accessibility and appeal of the imagery, there is very little academic scholarship on the topic. What is available argues that the primary period of American futurism falls in the interwar years, owing to the introduction of new innovations and of particular concepts of the future. However, while the images were first being produced then, they were not yet immersed in everyday public life the way they would be in the 1950s and 1960s.[1] 
      The dropping of the atomic bombs at the end of the Second World War damaged the view of science and progress for many people. In response, businesses and agencies pushed out marketed materials to normalize the idea of superweapons in the world. Nuclear research was supposed to provide humanity with an answer to clean, renewable energy and launch society into a more productive future. At the same time, leading social science experts, with assistance from government-funded think tanks, were working to apply the methods of science to their fields in order to contribute to the effort of predicting potential future threats.[2] In order to understand the impact of the world's introduction to the atomic bomb, it is necessary to look at the changes in science in the previous decades. Overall, the historiography of scientific progress is highly concerned with understanding the intellectual and psychological implications of modern technology and science for society. 

           Concepts of the future began to appear in many forms of popular media by the 1920s as consumer habits prompted new marketing techniques. However, despite these early attempts to gain support for new products, and with an abundance of new technologies available, the public was not very receptive to these efforts. Polls conducted by government agencies and private entities show that many people considered new technologies, such as television, to be short-lived fads.[3] Cities and industries were growing and transforming, but many technological changes had not yet taken root on the wide scale they would reach by the Cold War. This period was also less impactful in terms of long-term middle-class consumerism because of the Great Depression, and a second marketing push in the late 1930s was stalled by the Second World War.[4]
      During the interwar era, science made significant advances, and it continued to do so during and after the Second World War with weapons and missile technology research. The concept of alternative energy gained popularity with particle research and other scientific advancements. Approaching the Second World War, the interest in researching nuclear energy and power can largely be attributed to reports of European scientists fleeing to the United States with knowledge, and to rumors of superweapons being researched in Germany. It remains under debate whether the rapid advancements made by American research teams from the start of the war came because they were under the pressure of an arms race, one which may not have existed to the severity that was feared and reported.[5]
      While there are many examples of futurism from the 1920s to the 1940s, the end of the Second World War marks several significant changes in this theme. For one, technological advancements such as the household television, by then widely embraced, altered the way American citizens received information. It took time for those ideas of progress to root themselves in societal consciousness rather than become a passing trend. Above all, however, was the use of atomic weaponry on civilian populations at the end of the Second World War, which changed the popular American conception of the impact of science, shifted moral perceptions, and raised the question of whether limits should be imposed on scientific research. The bombs not only introduced the world to an important new advancement in science but did so in a way that negatively shaped the public's view of nuclear technology as a whole; under that influence, much of the work involved since has been either kept hidden or only vaguely described. 
      At the beginning of the twentieth century, the atomic hypothesis was only just being embraced by scientists, and even then it remained a theory. The advancements of the first decades of the twentieth century in chemistry, physics, and mathematics not only catapulted humanity into a new age in which science could produce practical advancements but also made possible the instruments of war that would define the modern era.[6] The changes were rapid, and the scientists who studied the widely accepted theoretical laws of the universe at the turn of the century were largely the same individuals working on secret government projects by the Second World War, rewriting humanity's greater knowledge of the physical universe.
      The perspectives of these scientists are important to understanding futurism because of the changing discourse surrounding science education and the profession before and after the war. Before the Second World War, the standards for students of science in America were very loose. The curriculum was not the modern one intended to make well-rounded citizens; instead, students pursued a topic in science and focused on personal research under the guidance of faculty. Scientists involved in the Manhattan Project questioned the culpability of practicing researchers, and many were voices in the argument to redefine the sciences and science studies in American universities.[7] Previously, scientists had portrayed themselves as unconcerned with societal affairs and had tried to insulate research processes from political concerns.[8]
      The discourse following the war shifted to encourage the idea of a general education whose purpose was to create scientists who weighed human concerns about the potential effects their research avenues could have on humanity. James B. Conant, Philipp Frank, and J. Robert Oppenheimer, scientists influential to the changing standards of university education following the war, aligned themselves and their work with that of artists and writers.[9] This was a complicated argument that involved anger at, and disrespect for, the categorization of the social sciences, such as psychology and economics, as sciences that would see human processes objectively quantified like matter and data in the physical sciences. Many scientists instead preferred to identify themselves as creative individuals like artists and writers.[10] That is a relationship which, arguably, is as old as humanity's ancient astronomers, but it ties directly into the futurism of the century before. 
           In the aftermath of the Second World War, the development of atomic instruments of war altered the public estimation of science as an instrument of progress. In Science, Democracy, and the American University, Andrew Jewett argues that the discourse of scientists in the post-Second World War period trickled down to affect American conceptions of the future. He explains how, in the late 1940s, the physical scientists who shaped discourse on science in America increased their efforts to define their objectives in the university system and to separate themselves from the social sciences. He states, "they began to actively align themselves with literature and the arts . . . creativity anchored a postwar epistemological discourse that identified scientific knowledge as a product of individual creativity."[11] Not only does this shed light on how scientists saw themselves and their work moving forward, but it also shows how other fields increased their efforts to apply scientific principles and methods. Although it might seem that society would step back from the concept of scientific progress after the dropping of the bombs, people were still embracing the practices of scientists in various applications.
           However, these changes had other consequences. Not only were scientists redefining their work, but they were also considering the morality behind it. Before the Second World War it was not uncommon for a young individual to enter university and have significant control over the coursework they would complete, or to work strictly on their personal research with a team or advisor. The concept of general education requirements in universities took prominence only after the Second World War. From the perspective of academics in the sciences, the aim was to ensure that young scientists considered societal implications before embarking on scientific research. This was an obvious response to, at the very least, the publicized guilt of those associated with the Manhattan Project. There is a prevalent theme in the historiography in which the subjective questions of hope for a better tomorrow or fear of the current situation are always at the center of attempts to rationalize the sentiments of the era.[12]
           Hope and fear may have sparked movement in the discourse over the questions of what scientists should do and how they should be limited, but the discourse itself affected the advertised perceptions of science and the marketed future of tomorrow. The discourse ignited around the democratization of science and the redefinition of scientists' place in the universities is a significant aspect in explaining the images marketed to Americans regarding the future.[13] Engineers began to separate themselves from the theoretical sciences and from those concerned with the greater answers to the universe.[14] It was the physical scientists who first had a part in the products being produced and marketed. 
           Along with scientists redefining their roles, social scientists were pushing to align their work more directly with the physical sciences. This aided the legitimacy of prediction but also facilitated the transfer of those ideals to businesses and the public. Futures studies developed as an intellectual movement during the 1960s and gained traction through the 1970s.[15] Government-funded think tanks attempted to merge scientific principles and social policy. In 1963, with the help of mathematicians and statisticians, the RAND Corporation developed the Delphi Method in an attempt to quantify the communist threat. While these predictions were aimed at military strategy, the goal behind them was centered on applying scientific principles and techniques to unquantifiable data in order to present objective arguments about subjective social relationships. 
           The merging and division of research fields were also reflected in how scientific research was being publicized. The advancement of nuclear technology in the 1950s was followed by research on intercontinental ballistic missiles, which led to an interest in breaking the Earth's atmosphere. Aside from the scientific achievement of leaving the Earth, this possibility appealed as a military strategic advantage: it was thought it could aid spying and message transmission and provide easy access to other nations.[16] These ideas, however, rested on varying perspectives. On the one hand, such technology would benefit a nation's power; on the other, its realization meant other nations would hold those same powers, or for a time be ahead; hence, the advent of the arms race. Those fear-based predictions of what missile and satellite technology could provide have all proven true.
 
[1] Lynn Spigel, “Portable TV: Studies in Domestic Space Travel,” 110-133. Consumers' interests had redefined the purpose of information technology such as the television, facilitating the spread of marketed visions of the future. 
[2] Theodore M. Porter, “Positioning Social Science in Cold War America,” Cold War Social Science: Knowledge Production, Liberal Democracy, and Human Nature.
[3] Marita Sturken and Douglas Thomas, “Introduction: Technological Visions and the Rhetoric of the New,” 3-12. Many technologies are noted to have been viewed as passing curiosities before consumer interest transformed their original purposes into new uses. 
[4] L. Winner, “Sow’s Ears…” 39-40.
[5] James Mahaffey, Atomic Awakening: A New Look at the History and Future of Nuclear Power, (New York: Pegasus Books, 2009). 
[6] Thomas S. Kuhn, The Structure of Scientific Revolutions, (Chicago: University of Chicago Press, 1996).
[7] Rhodes, The Making of the Atomic Bomb, 124. 
[8] Andrew Jewett, Science, Democracy, and the American University: From the Civil War to the Cold War, (Cambridge: Cambridge University Press, 2012), 310. 
[9] Ibid., 314.
[10] Ibid., 314.
[11] Ibid., 314.
[12] Heilbroner’s Visions of the Future; Mahaffey’s Atomic Awakening; Corn’s Yesterday’s Tomorrows; Barlow’s “The Future of Prediction.”
[13] C. P. Snow, The Two Cultures, (New York: Cambridge University Press, 2008). C. P. Snow, a professor of Physics at Cambridge, argued in his 1959 lecture that the oppositional attitudes of the scientific disciplines and the humanities were harmful to the very purpose of academic research and human progress.
[14] Jewett, Science, Democracy, and the American University, 315. 
[15] For further reading, see works on the Futures Studies movement.
[16] Harold John Blackham, The Future of Our Past: From Ancient Greece to the Global Village, (University of Michigan Press, 1996), 259.
