Date of Award: 2019
Degree: Doctor of Philosophy (College of Arts and Science)
Schools and Centres: Arts & Sciences
Supervisors: Professor Deborah Gare, Professor Charlie Fox
The weapon first created by atomic scientists in the 1940s was unprecedented in its power and potential to kill. Not only could it destroy infrastructure and all living things over a wide area; it also left a haunting, invisible footprint of radiation that continued to harm long after its heat had dissipated. The atomic bomb was conceptualised, proven and built by civilian scientists, overseen by an ambitious military and wary bureaucrats. The scientists lobbied their governments belligerently to take the potential of atomic weaponry seriously, and it is therefore not surprising that they are often portrayed as ghoulishly mad savants who strung the bow of mass destruction.1
The atomic bomb proved such an effective killing machine that it provoked the Anglo-Australian physicist Sir Ernest Titterton to include a chapter in his 1956 book, Facing an Atomic Future, entitled ‘The Economics of Slaughter’.2 Titterton presented grotesque calculations suggesting that atomic weaponry could kill for as little as ‘2½ d [pence] per man, woman and child’.3 The atomic bomb, as we know, played a decisive hand in ending the world’s deadliest war, World War Two. During the Cold War the atomic bomb, and its even more devastating offspring, the thermonuclear hydrogen bomb, caused tension, anxiety and outright fear as the world’s superpowers faced off in an arms race in which all-out conflict could have resulted in the end of humanity.
The story of the twentieth century is, in many respects, the story of the atom. In the early years, investigations into the structure of the atom were centred in powerful European nations such as Britain, Germany and France. During the war, however, the United States borrowed scientists and knowledge from Europe and combined them with its own resources and enterprise to produce, with remarkable efficiency, the technology that delivered the final vanquishing moments of World War Two. This rise of American atomic capability continued into the Cold War arms race. Postwar industry, moreover, looked in wonderment at the technology achieved during the war and saw how productive large groups of collaborating scientists could be. The postwar technological age was, in part, the product of a shift in the mode of scientific research from the university to government, the military and private enterprise.
The origins of the atomic age can be traced to Henri Becquerel and Marie and Pierre Curie’s discovery of radioactivity in the late nineteenth century; Albert Einstein’s Special Theory of Relativity in 1905; and Ernest Rutherford’s proof of the structure of the atom in 1909.4 The atomic age reached its climax with the atomic bombs that smote Japan in August 1945. History links several names particularly to the atomic bomb: the Germans Otto Hahn and Friedrich Strassmann, who split the uranium atom in 1938; the Austrians Lise Meitner and Otto Frisch, who first explained this as nuclear fission in 1939; the Hungarian Leo Szilard, who theorised an uncontrolled nuclear explosion in the same year; Enrico Fermi, the Italian who built the first nuclear reactor; and the eccentric American polymath Robert Oppenheimer, who led the Manhattan Project to build the first bombs. Yet in the background was Mark Oliphant, a remarkable Australian scientist whose intellect, likeable and roguish personality, and international friendships helped stitch together the vast patchwork of scientists that made the bomb possible.
Holden, D. (2019). Mark Oliphant and the Invisible College of the Peaceful Atom (Doctor of Philosophy (College of Arts and Science)). University of Notre Dame Australia. https://researchonline.nd.edu.au/theses/270