James Conca, Forbes
Special to In Homeland Security
This is a very, very important question, as relevant today as it was in 1943. Richard Rhodes recently gave a lecture at the Hanford site in Washington State for the 70th Anniversary of the Manhattan Project that provides more insight into this issue than any other I have ever heard. It is rare to get a glimpse of what emotions, paradigms, and philosophies motivate people during such world-changing events as entering a World War or developing atomic weapons.
But understanding humans and history is what Richard Rhodes does best.
Rhodes is renowned as the Pulitzer Prize-winning author of The Making of the Atomic Bomb as well as dozens of other non-fiction and fiction books, biographies and documentaries. But he has also been a scholar at Harvard, MIT and the Center for International Security and Cooperation at Stanford University.
For the present and the future, we need to understand these past events from the perspective of that time period, not our own. The horrors of that War – fifty million dead, hundreds of millions of lives destroyed, whole cities wiped off the map from run-of-the-mill carpet bombing – rarely make it into discussions of the ethics of developing the atomic bomb.
As much as we like to use hindsight to judge such momentous deeds, we will fall prey to the same mistakes if we do not appreciate how social forces usurp both science and scientists and how the different forms of government stand between the different futures of our planet – some good and some bad.
With permission, I am reprinting Rhodes’ lecture below. Though longer than my normal posts, it is well worth the read. You will be amazed at the amount of information in this lecture that is essentially unknown to the public and even many of us in the field. It will certainly add another layer of insight into our present nuclear challenges in Iran, Pakistan and North Korea.
The reader should be struck foremost by the realization that the atomic bomb and its use did not occur in a vacuum. Even the concept of mutually-assured destruction, an idea that everyone assumes first appeared with nuclear weapons, actually came about decades earlier during WWI with the widespread use of chemical weapons.
So grab an espresso (it was called demitasse back then), sit back and be enlightened…
The Atomic Bomb and Its Consequences ©2013 by Richard Rhodes
Nuclear fission was discovered accidentally in Nazi Germany on December 21st, 1938, nine months before the beginning of the Second World War. It was a discovery that in the long run would sharply limit national sovereignty and change forever the relationship between nation-states, and it came as a complete surprise.
The German radiochemists Otto Hahn and Fritz Strassmann, working at the Kaiser Wilhelm Institute for Chemistry in Dahlem, a suburb of Berlin, were bombarding a solution of uranium nitrate with lukewarm neutrons, transmuting microscopic quantities of the uranium into a brew of substances of differing characteristic radioactivities which the two chemists believed might include new manmade elements heavier than uranium as well as familiar elements like radium, one of uranium’s natural “daughters.” Instead of radium, however, Hahn and Strassmann found barium in their irradiated solutions, an element only about half as heavy as uranium that they had not expected to find there and that had not been there before. When they had consulted with their Jewish physicist colleague Lise Meitner, who had escaped Nazi Germany to exile in Sweden, and when Meitner had consulted with her physicist nephew Otto Frisch, who came over from Denmark to visit her, it became clear that the unexpected barium was a marker for a new species of nuclear reaction—that in fact Hahn and Strassmann’s neutrons had split uranium nuclei, atomic number 92, into two nearly equal pieces, one of barium, atomic number 56, and one of krypton, atomic number 36: 56 plus 36 equals 92. The new reaction, converting a small amount of matter into energy, was fiercely exothermic, ten million times as much energy coming out as the neutrons carried in. Physicists had known for forty years, ever since the discovery of radioactivity, that enormous energy was locked up in the atom. Here at last was a way to release it. Otto Hahn, a veteran of the First World War, said later that he brooded on the probable military applications of his discovery and seriously considered suicide.
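The bookkeeping Meitner and Frisch worked through can be sketched in a few lines. This is a back-of-the-envelope check, assuming modern tabulated atomic masses for one typical fission channel (U-235 + n → Ba-141 + Kr-92 + 3 n), not the exact fragments Hahn and Strassmann observed; the point is only that charge is conserved and that the mass defect yields an energy enormously larger than chemistry can provide.

```python
# Back-of-the-envelope check of the fission energetics described above.
# Atomic masses in unified mass units (u); modern values for one
# illustrative fission channel, not Hahn and Strassmann's exact fragments.
U235, NEUTRON = 235.043930, 1.008665
BA141, KR92 = 140.914411, 91.926156

# Charge (atomic number) is conserved: barium 56 + krypton 36 = uranium 92.
assert 56 + 36 == 92

# Mass defect of U-235 + n -> Ba-141 + Kr-92 + 3n, converted to MeV.
mass_in = U235 + NEUTRON
mass_out = BA141 + KR92 + 3 * NEUTRON
q_mev = (mass_in - mass_out) * 931.494  # 1 u = 931.494 MeV/c^2

print(f"energy released per fission: {q_mev:.0f} MeV")  # roughly 173 MeV
```

A typical chemical bond, by comparison, stores a few electron-volts, which is why a reaction releasing on the order of a hundred million times that per atom struck the physicists so forcefully.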
Word spread quickly across the small world community of physicists. Hahn and Strassmann published their results in the German science journal Naturwissenschaften, as scientists do. Frisch and Meitner told Niels Bohr, the great Danish physicist, and followed up with confirming physical experiments published in the British journal Nature. From biology, from the process whereby cells divide, binary fission, they borrowed a name for the new reaction—nuclear fission. Bohr carried the news to America early in 1939 in the course of attending a physics conference in Washington. Soviet physicists working in Leningrad under young Igor Kurchatov, British physicists, German physicists, the French team at the Radium Institute in Paris, American experimenters from coast to coast rushed to demonstrate nuclear fission in their laboratories with off-the-shelf equipment: the discovery, as the physicist Philip Morrison would say later, was “overripe” (1). A Japanese Army lieutenant general who was an electrical engineer, reading the report in Nature, assigned a member of his staff to track it. If Hahn and Strassmann hadn’t discovered nuclear fission in Germany, others would soon have discovered it in some other laboratory somewhere else. Here was no Faustian bargain, as some even all these years later find it comforting to imagine. Here was no evil machinery one or another noble scientist might have hidden from the politicians and the generals. To the contrary, here was a new insight into how the world worked, an energetic reaction older than the earth that science had finally devised the instruments and arrangements to coax forth. As the American theoretical physicist Robert Oppenheimer would say, “It is a profound and necessary truth that the deep things in science are not found because they are useful; they were found because it was possible to find them.”
The physicists saw immediately what might be done with the new reaction. Hungarian emigré physicist Leo Szilard told his American patron Lewis Strauss on January 25, 1939, that nuclear energy might be a means of producing power, and mentioned “atomic bombs” (2). The young Berkeley graduate student Philip Morrison remembered that “when fission was discovered, within perhaps a week there was on the blackboard in Robert Oppenheimer’s office [at Berkeley] a drawing—a very bad, an execrable drawing—of a bomb” (3). “These possibilities,” commented American theoretician Robert Serber, “were immediately obvious to any good physicist” (4). Within months of the German discovery, the Italian Nobel laureate Enrico Fermi would stand at his panoramic office window high in the physics tower at Columbia University, look down the gray winter length of Manhattan Island alive with vendors and taxis and crowds, cup his hands as if he were holding a ball and say simply, “A little bomb like that, and it would all disappear” (5).
Why would these men of good will, who believed themselves to be members of a peaceful international community of scientists, want to build a weapon of mass destruction? Always and everywhere in that first round of nuclear proliferation the same reason repeats: because possession of such a weapon appeared to be the only defense against an enemy similarly armed. Deterrence had already been debated publicly and at length during the 1930s in the context of aerial bombardment. It found its first documented expression in the context of nuclear weapons in a secret report prepared early in 1940 as a warning to the British government by two emigré physicists, Otto Frisch (again) and Rudolf Peierls, a report that also first laid out on paper the basic design and operation of an atomic bomb. “If one works on the assumption,” the two physicists wrote, “that Germany is, or will be, in the possession of this weapon, it must be realized that no shelters are available that would be effective and could be used on a large scale. The most effective reply would be a counter-threat with a similar bomb. Therefore it seems to us important to start production as soon and as rapidly as possible, even if it is not intended to use the bomb as a means of attack” (6).
The world was at war. The new tool of nuclear energy, like all tools, might also serve as a weapon. In the course of the Second World War, every major industrial nation began a program to build atomic bombs: the Germans, the British, the French before their surrender, the Soviets, the Americans, the Japanese. But nuclear-weapons development required a massive commitment of government funds, funds that would have to be diverted from the conventional prosecution of the war. If atomic bombs could be built, they would be decisive, in which case no belligerent could afford not to pursue them. But making that judgment depended on two corollary assessments: the first, whether or not such weapons were inventible—whether nature would allow such an explosion to proceed; the second, whether or not the enemy was capable of producing them in time to affect the outcome of the war. Both assessments depended critically on how much scientists trusted their governments and how much governments trusted their scientists.
Trust would not be a defining issue later, after the secret, the one and only secret—that the weapon worked—became known. This first time around, however, it was crucial, as the Russian physicist Victor Adamsky, who worked on the Soviet bomb, has pointed out:
The tension [between the scientists and their governments, Adamsky writes,] stemmed from the fact that there existed no a priori certainty of the possibility of creating an atomic bomb, and merely for clarification of the matter it was necessary to get through an interim stage: to create a device (the nuclear reactor) in order to perform a controlled chain reaction instead of the explosive kind. But the implementation of this stage requires tremendous expenses, incomparable to any of those previously spared for the benefit of scientific research. And it was necessary to tell this straight to your government, making it clear that the expenses may turn out to be in vain—an atomic bomb may not come out….The American nuclear scientists…addressed the President…directly and described that complicated situation to him…. After a number of procrastinations which are inevitable even in a democratic society, a decision was taken in the USA to make the research as comprehensive as required by logic, disregarding the [un]certainty of the final result.
…There was [no such confidence and mutual understanding] in Germany (7).
In the United States the trust was there, and President Franklin D. Roosevelt duly authorized a full-scale Anglo-American nuclear weapons program on October 9th, 1941. In Germany the trust was not there on either side, and the German program fragmented and stalled. After 1942, Werner Heisenberg, Otto Hahn, Carl Friedrich von Weizsäcker turned their attention to building a nuclear reactor and the bomb went by the board. Nor did the German scientists believe the Allies could do what they themselves had not judged feasible (8).
The French program was stillborn. The Soviets, fighting for their lives against an almost overwhelming German invasion, put their early work on hold, not entirely convinced of its importance, and revived it in 1943 after the Red Army pushed back the Wehrmacht outside Moscow and espionage revealed the extent of the developing programs in Britain and the United States. The Japanese saw that a bomb program was beyond their resources, estimated incorrectly that it was also beyond American resources, and scaled down their efforts to laboratory studies of uranium isotope separation.
The Anglo-Americans knew very little of these developments. Until 1944, they raced against an imaginary German clock, calculating that from the discovery of fission forward, the Germans might have at least a two-year lead. Then another and more terrible clock ticked off the project’s hours: the clock of the war itself, of the young men dying on the battlefields of Europe and Russia and the bloody Pacific beaches. The Germans, the British, the French had used poison gas in the last Great War, as they all said, to shorten the war and save lives; Robert Oppenheimer, recruiting scientists for a secret laboratory in New Mexico where the first bombs would be designed and built, whispered that he couldn’t tell them what they would be doing, but he could tell them that their work would end the war and save lives.
Before Oppenheimer began recruiting, Szilard, Fermi and their colleagues at Columbia and then at the University of Chicago had to accomplish the intermediate step Adamsky mentions: they had to build an experimental nuclear reactor to prove that it was possible to achieve a controlled chain reaction in uranium. This would be a slow-neutron chain reaction, multiplying in thousandths of a second and relatively easy to control, not the microsecond fast-neutron chain reaction that would proceed in a bomb, but fission was the source of the energy in both arrangements. The reactor design Szilard and Fermi worked out and jointly patented was a spherical assembly as large as a two-car garage of graphite blocks drilled with blind holes into which would be inserted pucks or slugs of uranium oxide or metal. They needed 700,000 pounds of highly purified graphite; all the uranium metal they could get, which turned out to be 12,400 pounds; and some 80,000 pounds of oxide. None of these materials could be bought off the shelf; their manufacture had to be developed and subsidized.
The “secret” of the bomb would turn out to be industrial production on an enormous scale—2.2 billion-dollars-worth by the end of the war in 1945 dollars, about the same as it cost twenty years later to go to the moon. What began as a table-top experiment on a laboratory bench in Germany in 1938 became, in the United States, an industry comparable in scale to the United States automobile industry of the day (9). Niels Bohr had gone back to Denmark in 1939 secure in the conviction that no nation could afford to build such an industry in time of war. The United States not only did so; it did so redundantly, pursuing three different and expensive paths to accumulating the necessary quantities of fissionable materials. The Manhattan Project, as the program to build atomic bombs was named, commanded a higher priority for materials and staff than any other program of the war—not because anyone thought the atomic bomb would win the war, but because its sole possession by an enemy might turn Allied victory abruptly into defeat.
Fermi called his construction a “pile” because it was made that way, by piling up layers of uranium-slugged graphite bricks crosswise one on top of the other to achieve a critical mass. The pile would not only prove the chain reaction; it would also be a model cauldron for transmuting from uranium the manmade element plutonium, Berkeley radiochemist Glenn Seaborg’s discovery, several times more fissionable than uranium itself. Fermi’s pile went critical in a doubles squash court under the stands of Stagg Field at the University of Chicago on December 2, 1942, one year almost to the day after the Japanese attack on Pearl Harbor.
Scientists at the Metallurgical Laboratory of the University of Chicago determined the parameters of the site where plutonium would be produced for the first atomic bombs. Thirty-three-year-old Army Corps of Engineers Colonel Franklin T. Matthias, known to his friends as “Fritz,” wrote the requirements into his diary after a meeting at DuPont’s home offices in Wilmington, Delaware, on December 14, 1942: The site needed to be spacious enough to accommodate a manufacturing area of approximately 12 by 16 miles, with no public highway or railroad nearer than 10 miles, no town of greater than 1,000 population nearer than 20 miles, an available water supply of at least 25,000 gallons per minute and an electrical supply of at least 100,000 kilowatts. Matthias looked in the Grand Coulee area of Washington and at several sites in Tennessee before flying over Hanford in an Army observation plane. “I came back over [the] Horse Heaven [Hills],” he remembered many years later— “in the area northeasterly from Plymouth—and over Rattlesnake Mountain to the Hanford site from the west, and I got over that mountain, and I had looked at everything else, and I knew that was it, right then.” His boss, Brigadier General Leslie Richard Groves, agreed, and the Corps of Engineers began land appraisals at the Hanford site in January 1943.
The first great question was what kind of cooling system to use in the production reactors that would be built and operated at Hanford to make plutonium. Graphite would serve as moderator, uranium metal as fuel. The fission chain reaction would release tens and hundreds of thousands of kilowatts of energy, and since these reactors were being built to breed plutonium, that energy would not be used to make steam to generate electricity but would have to be transferred away. The first chain reaction in the natural-uranium squash court pile at the University of Chicago had operated with a barely positive reactivity of 1.006, so conserving neutrons was an important consideration. Helium, which absorbed no neutrons at all, was the coolant of choice at first, but Hungarian theoretical physicist Eugene Wigner, trained as an engineer, held out for water despite its neutron-scavenging propensities because it would be simpler and thus faster to engineer. Wigner judged that they would improve reactivity in the big production reactors with purer materials. He was convinced that Nazi Germany was ahead of the United States in bomb development. Wigner even moved his family out of Chicago in December 1943, when he estimated the German head start might have given them atomic bombs by then—which he thought Germany would, logically, drop first on the Met Lab.
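The thinness of that 1.006 margin is worth pausing over. A small illustrative calculation (only the multiplication factor 1.006 comes from the text; the generation count and absorber fraction are hypothetical) shows why a fraction of a percent of parasitic neutron absorption was the difference between a working pile and a dead one:

```python
# Illustration of why a multiplication factor of 1.006 left so little
# room for neutron-absorbing impurities. Only k = 1.006 comes from the
# text; the other numbers are purely illustrative.
k = 1.006  # neutrons produced per neutron absorbed in the CP-1 pile

# The neutron population after n generations grows as k**n:
# slow, but still exponential, so the pile sustains itself.
population_after_1000 = k ** 1000
print(f"growth after 1000 generations: x{population_after_1000:.0f}")

# An absorber capturing just 0.6% of neutrons per generation would
# cancel the whole margin and the chain reaction would die out.
k_poisoned = k * (1 - 0.006)
print(f"poisoned k: {k_poisoned:.4f}")  # just below 1.0 -> subcritical
```

This is the arithmetic behind Wigner's insistence on purer materials, and behind the consternation when the B Reactor later began losing reactivity for no visible reason.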
Eventually Enrico Fermi, General Groves and DuPont’s Crawford Greenewalt came round to once-through water cooling. Wigner designed an elegant reactor: a three-story assembly of graphite blocks drilled through with a cylindrical lattice of channels into which could be inserted slugs of uranium metal clad with aluminum. Water from the Columbia River would flow through the channels around the slugs for cooling, and when the slugs had been sufficiently exposed to the pile’s neutron flux to breed about a dime’s weight of neptunium, element 93, per ton of uranium, they could be pushed out the back into a cooling pool, where the neptunium would quickly decay to plutonium, element 94.
Which brings up one of the mysteries I mentioned. The B Reactor at Hanford was the first to go critical, late in the evening on September 26th, 1944. Early the next morning the power was increased to 9 megawatts and held there. Then, to everyone’s surprise and consternation, the reactivity began slowly to decrease, at a rate that would drop the reaction below criticality at about six in the afternoon. To slow any possible water leak they reduced the pressure, which dropped the power to 200 kilowatts, but the reactivity continued to decline, at which point they decided to shut the reactor down and hunt for leaks.
When Crawford Greenewalt returned with Fermi the next morning, September 28th, he wrote later, he “found that the pile had died according to prediction, but had mysteriously come to life starting at about 1 a.m. today. The reactivity had increased steadily,” Greenewalt continues, “and at 7 a.m. they started controlling the power at 0.2 MW. From this time on the activity kept increasing….During the night an attempt had been made to find leaks but neither conclusively or successfully.” They continued trying to check for leaks, but by now they had come to believe that something was poisoning the reaction, and that evening, to test their suspicion, they raised the power to 9 MW, Greenewalt wrote in his diary, “and the earlier phenomenon repeated itself almost exactly: the reactivity first flattened off, then decreased…. At midnight we dropped the power again to 0.2 MW and when I left at 2:30 a.m. the loss of reactivity was decreasing and looked definitely as though it was going to turn up.”
Since the reactivity seemed to be cycling with the increase and then decrease of the poison, they thought of two possible explanations: either the reactor’s radiation was causing some substance to deposit on the slugs and tubes—which the cooling water then dissolved off when the pile power was reduced—or some short-lived fission product was decaying to a longer-lived radioactive daughter with a large appetite for neutrons. From the data on the changes in the pile reactivity Greenewalt plotted the half-life of the daughter at 11.7 hours, but, he wrote, they “couldn’t think of any reasonable radioactive process which would produce the results.”
By the morning of Friday, September 29, however, physicist John Wheeler had solved the mystery. The offender was a fission chain after all. The most likely, Wheeler thought, was 6.6-hour iodine 135 decaying to 9.1-hour xenon 135. The loss of reactivity the xenon had caused meant it had thirty times the appetite for neutrons of any isotope previously known. Wheeler calculated that they could override the poisoning by increasing the pile’s reactivity by 1.3 percent, which they could do by loading more channels with slugs—up to 1,500 channels and, if necessary, 2,000.
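The cycle Greenewalt recorded, reactivity falling at power and mysteriously recovering hours after shutdown, follows directly from the iodine–xenon chain Wheeler identified. A minimal simulation sketches the qualitative behavior; only the two half-lives come from the lecture, while the production and burnup rates and the time scales are made-up illustrative numbers, not Hanford's actual operating figures:

```python
import math

# Toy model of xenon-135 poisoning: iodine-135 (6.6 h half-life) is
# produced by fission and decays to xenon-135 (9.1 h half-life), a
# voracious neutron absorber that is also burned off by the flux while
# the reactor runs. Rates are illustrative; half-lives are from the text.
LAM_I = math.log(2) / 6.6   # iodine decay constant, per hour
LAM_XE = math.log(2) / 9.1  # xenon decay constant, per hour

def run(hours_at_power, hours_shut_down, dt=0.01):
    """Euler-integrate iodine and xenon inventories (arbitrary units)."""
    iodine = xenon = 0.0
    history = []
    steps_on = int(hours_at_power / dt)
    steps_off = int(hours_shut_down / dt)
    for step in range(steps_on + steps_off):
        at_power = step < steps_on
        production = 1.0 if at_power else 0.0      # iodine from fission
        burnup = 0.2 * xenon if at_power else 0.0  # neutron capture on Xe
        d_iodine = production - LAM_I * iodine
        d_xenon = LAM_I * iodine - LAM_XE * xenon - burnup
        iodine += d_iodine * dt
        xenon += d_xenon * dt
        history.append(xenon)
    return history, steps_on

history, steps_on = run(hours_at_power=24, hours_shut_down=24)
peak_after_shutdown = max(history[steps_on:])
print(f"xenon at shutdown:       {history[steps_on - 1]:.2f}")
print(f"peak after shutdown:     {peak_after_shutdown:.2f}")  # rises first
print(f"24 h after shutdown:     {history[-1]:.2f}")          # then decays
```

Because accumulated iodine keeps feeding xenon after shutdown while the burnup stops, the poison first climbs and only later decays away, which is exactly why the dead pile "mysteriously came to life" in the small hours of the morning.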
Why was the reactor built with extra, unused channels? The accepted explanation, which I think may have come from postwar reminiscences by Fritz Matthias, is that DuPont engineers were conservative and wanted to leave a margin of safety in case a problem cropped up. In contrast, Wigner, trusting his calculations and wanting to move ahead as fast as possible to beat the Germans, had designed a lattice—the horizontal cylindrical arrangement I mentioned—that made optimum use of the minimum necessary number of channels. But it wasn’t simply DuPont conservatism that led Crawford Greenewalt to order extra channels drilled through the corners of the cubical graphite structure. John Wheeler had assured Greenewalt that there were no unknown decay products that would poison the chain reaction, and Greenewalt seems to have accepted Wheeler’s assurances. (Wheeler’s overconfidence may explain why he hustled so quickly to identify the isotopes that DID cause the poisoning.)
Greenewalt was in fact concerned with a different problem: water corrosion of the cladding around the uranium slugs, which could lead to leakage of the highly radioactive fission products into the cooling water and thus into the environment. It was possible, he realized, that the uranium slugs might have to be double-canned to prevent them from corroding, in which case the extra aluminum might scavenge enough neutrons to quench the chain reaction. To prepare for that possibility, Greenewalt ordered the corners of the reactor blocks drilled with extra channels where more uranium might be inserted to increase the reactor’s flux and override the aluminum can problem if it emerged—and fortuitously, the channels were then available to override the xenon poisoning no one had expected. Greenewalt’s decision was crucial, and he made it despite contrary advice from the Met Lab leadership. Had DuPont followed the Met Lab’s overconfident advice, the entire Hanford plutonium production program would have been stalled until new production piles could be designed and built, and the United States would have produced only one atomic bomb, the uranium bomb, Little Boy, in time to affect the outcome of the war.
The new channels were quickly lined, piped and loaded, and on November 24th, 1944, the B Reactor’s first irradiated slugs were pushed into the cooling pool.
The D reactor went critical on December 17th, 1944, at 11:11 in the morning, and on December 26th, the first charge of B reactor metal was dissolved in separation building 200-N.
By then, the Hanford Engineer Works was the third largest city in the state of Washington, a thriving society that Fritz Matthias oversaw. The baseball season had opened the previous summer when six crafts had fielded teams. Where orchards and dusty scablands had been, a community of almost 50,000 people had sprung to life a safe distance south of the futuristic piles and canyons and was finding its identity. “At the game Sunday,” Matthias noted proudly, “there were probably 5000 people watching. The plan is that three games will be played every Sunday until the end of the season.” There were churches now in the Hanford residential areas as well as schools, barber shops, beauty salons—and bars. Native Americans still pulled salmon from the river in season at a camp that dated back to the days of Lewis and Clark; and since the camp was within the secure area, Matthias had agreed to supply the tribe with trucks to haul its salmon out for smoking.
Six plutonium extraction runs were processed during January 1945, resulting in a “charge” of plutonium of 97 percent purity. “The charge,” the DuPont History reports, “was loaded into a ‘sample can’…on February 1st, 1945. Because the closure on the sample cans had been shown to have a high probability of leakage,” the History explains, “it was decided to evaporate the product solution nearly to dryness after loading. This was done on the first and all subsequent shipments.” According to Matthias, the quantity involved was “72,000 units,” which probably means 720 grams—three-quarters of a kilogram—painstakingly extracted from tens of tons of dissolved uranium. Matthias himself made that first delivery, on February 5th, 1945: “I drove from Hanford to Portland,” he remembered. “I had a guy with me and we had a locked space on the train from Portland to Los Angeles. [The container with the plutonium was] about a two-foot cube, wrapped up in wrapping paper and ropes, and inside was a test-tube thing suspended and secured—all surrounded by lead and rigged so it stayed right in the middle of that box. It was quite a heavy thing, and I carried it just like a box any traveler might have with him.”
F Reactor went critical on February 25th, 1945, and another shipment of product left Hanford on March 1st. The F Reactor was soon running smoothly, one of three now that were breeding plutonium around the clock. Matthias was able to inform Groves early in March that 10 kilograms of plutonium—enough for two bombs—would be ready for shipping between April 18th and July 12th. The first 5 kilograms would be used to test the implosion system Los Alamos had invented; the second 5 kilograms would be destined for Japan. In a memorandum Groves prepared for President Truman on April 23rd, 1945, shortly after the death of Franklin Roosevelt, he resolves another mystery—whether we would have used the bomb on Germany had it been ready before the German surrender in May. “The target,” Groves says he told the President, “is and was always expected to be Japan.”
By May 3rd, Matthias calculated that 20 cans of plutonium—about 3 kilograms—had left Hanford for Los Alamos. After the initial deliveries, Matthias had begun delivering the cans by army ambulance to Salt Lake City, where they were transferred to another ambulance driven up from New Mexico. The shipments took only two days, start to finish, to reach Los Alamos, Matthias noted proudly in his diary, “far better than could be done by train.” He noted VE Day—Victory in Europe, the defeat of Nazi Germany—on May 8th, prompting himself to make sure the War Department film “Two Down and One to Go” was shown to remind Hanford workers that the Allies had defeated Fascist Italy and Nazi Germany but were still fighting a war in the Pacific against the Japanese. He was happy with plutonium production results, which he attributed to “the reduction of initial cooling periods” which permitted “processing of pushed material at an earlier date than scheduled.” Colonel Kenneth Nichols, Groves’ deputy, was able to write Los Alamos director Robert Oppenheimer on June 1st promising cumulated production and delivery of 7 kilograms of plutonium by June 1st, 13 kg by July 1st, 20 kg by August 1st, 26 kg by September 1st, 40 kg by October 1st and 54 kg by October 31st. At 5 kilograms per bomb, that would be enough for 10 bombs, with a little left over.
As there were two kinds of nuclear materials, uranium and plutonium, so there were two kinds of bombs. The more conservative design, nicknamed Little Boy, was a six-foot cannon with a cylinder of U235 fitted inside the muzzle and an assembly of stacked rings of the same material to be fired up the barrel at the appropriate time like an artillery shell. When the rings slammed into place around the target cylinder they formed a supercritical mass and chain-reacted. The gun-type bomb was grossly inefficient, but experiments with diluted uranium confirmed that it was reliable, so much so that it was certified for use without proof testing. In any case there wasn’t enough highly-enriched uranium for a test; the first Little Boy built was the one that was used.
Los Alamos, Oppenheimer’s secret laboratory on an extension of the vast Valle Grande caldera northwest of Santa Fe, had discovered to its horror in the spring of 1944 that an admixture of plutonium 240 and higher isotopes in Hanford reactor-bred plutonium made that material so unstable that a stack fired up a gun barrel even at 3,000 feet per second would melt down before it had time to mate and explode at full yield. Oppenheimer ordered a massive shift of laboratory priorities that summer to develop an alternative method of assembling a critical mass, a method the physicists named implosion: using a lensed sphere of high explosives to squeeze a subcritical sphere of plutonium to critical density.
The laboratory worked night and day for the rest of 1944 and the first half of 1945 to develop implosion; the technology was sufficiently unreliable even then to require a full-scale test. The test, in the New Mexican desert northwest of Alamogordo, counted down to zero just before dawn on July 16th, 1945, the first full-scale, manmade fast-neutron chain reaction, exploding with a force equivalent to that of 18,000 [tons] of TNT, 18 kilotons, a great fireball lighting up the predawn morning, thrusting into the pristine desert air on a stem of roiling gas and smoke. “No one who saw it could forget it,” said the director of the test, Ken Bainbridge; he called it “a foul and awesome display” (10). That same morning, the cruiser Indianapolis steamed out of San Francisco Bay carrying Little Boy and its uranium bullet, bound for Tinian Island in the Marianas, a thousand miles from Japan, where the specially-configured B-29’s that would carry the weapons to the Japanese homeland had staged out in June and July. Little Boy’s target assembly followed by air on July 26th. So did two high-explosive assemblies for plutonium bombs—the spherical implosion weapon nicknamed Fat Man, to be used after Little Boy, and a second Fat Man for which a plutonium core would be ready on August 12th. More atomic bombs would be ready if needed, Oppenheimer projected in late July: “…from possibly three in September,” he told Groves, “to we hope seven or more in December” (11).
President Harry Truman was waiting eagerly at the Potsdam Conference for word of a successful test. It bucked him up. The Soviet Union was still officially neutral in the Pacific War; Stalin had promised to declare war and begin fighting the Japanese on August 15th, and until the news came of the successful test in New Mexico, Truman’s major concern had been to shore up Stalin’s commitment. The test changed the stakes; now Truman wanted to end the war before the Russians joined it, to avoid a political division of Japan like the political division developing in Germany. As Truman confided to his diary, “Believe Japs will fold up before Russia comes in. I am sure they will when Manhattan appears over their homeland” (12). General George Marshall, the Army chief of staff, whose judgment everyone respected, remembered later why there was not more discussion in those final days of summer about demonstrating or pocketing the bomb:
We regarded the matter of dropping the [atomic] bomb as exceedingly important [Marshall said]. We had just gone through a bitter experience at Okinawa [the last major island campaign, when the United States lost more than 12,500 men killed and missing and Japan more than 100,000 killed in eighty-two grim days of fighting]. This had been preceded by a number of similar experiences in other Pacific islands, north of Australia. The Japanese had demonstrated in each case [Marshall goes on] [that] they would not surrender and would fight to the death….It was expected that resistance in Japan, with their home ties, would be even more severe. We had had the one hundred thousand people killed in Tokyo in one night of [conventional fire-] bombs, and it had had seemingly no effect whatsoever. It destroyed the Japanese cities, yes, but their morale was not affected as far as we could tell, not at all. So it seemed quite necessary, if we could, to shock them into action….We had to end the war [Marshall concludes], we had to save American lives (13).
And Japanese lives as well, I might add. And in truth the first atomic bombs would not be even a quantitative extension of the destruction strategic bombing had already wreaked on the cities of Japan; since late April, Air Force General Curtis LeMay’s B-29’s had been systematically firebombing Japanese cities one by one to utter destruction, killing hundreds of thousands of civilians in the process; by August 1st the B-29’s were burning down cities of less than 50,000 population, almost the only cities left in Japan to burn. Hiroshima and Nagasaki survived to be atomic-bombed only because they had been deliberately reserved as targets, so that the effects of those first bombs could be assessed.
The devastation of Hiroshima on August 6th was complete. Little Boy yielded 15 kilotons. Of 76,000 buildings in the southern port city spread across the seven estuaries of the Ota River, 70,000 were damaged or destroyed, 48,000 totally. Ninety percent of all Hiroshima medical personnel were killed or disabled. Up to September 1st at least 70,000 people died. More died later of the effects of radiation.
George Marshall remembered that he was surprised and shocked that the Japanese didn’t immediately sue for peace. “What we did not take into account,” he said, “…was that the destruction would be so complete that it would be an appreciable time before the actual facts of the case would get to Tokyo….There was no communication for at least a day, I think, and maybe longer” (14). The Air Force distributed millions of leaflets over Japanese cities in the next several days suggesting that skeptics “make inquiry as to what happened to Hiroshima” and asking the Japanese people to “petition the Emperor to end the war” (15). Conventional bombing continued as well.
But there was now a struggle for power between civilian and military leaders within the Japanese government, no surrender emerged, and on August 9th Fat Man exploded over Nagasaki with a 22-kiloton yield, killing at least another 40,000 people and devastating another Japanese city. The Soviet Union entered the war as well, confronting the Japanese leadership with fresh armies and navies attacking in Manchuria and down from the north into Hokkaido. Finally, breaking tradition, Emperor Hirohito insisted that the government communicate its surrender, and reluctantly it did. In his historic broadcast to his people on August 15th, Hirohito specifically cited “a new and most cruel bomb, the power of which to do damage is indeed incalculable, taking the toll of many innocent lives” as “the reason why We have ordered the acceptance of the provisions of the Joint declaration of the Powers” (16).
The atomic bombs that exploded over Hiroshima and Nagasaki didn’t win the Pacific war, but contributed crucially to ending the war. What might have followed had the bombs not been used, no one can say with certainty, except that the two cities that were atomic-bombed would certainly have been firebombed, probably to equivalent loss of life. The Japanese might have surrendered. Or the Allies might have had to invade the Japanese home islands, as they were vigorously preparing to do. The Russians would have joined in that invasion, and having done so would certainly have insisted on a larger share of the spoils than the Kuril Islands. Japan might have been partitioned as Korea and Germany were partitioned.
Was it necessary to drop the bombs? Were they weapons of mass destruction? Was their use a crime against humanity? I think such questions as these beg the real question, which is why war became so much more destructive in the first half of the 20th century than ever before in the history of our species. It did so, I believe, because efficient killing technologies made the traditional exercise of national sovereignty pathological. To quote the United States Strategic Bombing Survey:
“The number of civilian deaths in Japan greatly exceeded the number of strictly military deaths inflicted on the Japanese in combat by the armed forces of the United States. This statement is pregnant with significance, for if there still be doubt that the emphasis in warfare has shifted from military forces to the civilian population, then this fact should dissipate all uncertainty (17).”
With disease, blockade, famine and fire it had sometimes been possible to kill large numbers of people in the past. The 20th century arsenal of massed artillery and aerial bombardment made such slaughter more certain and more efficient—made it industrial, mass-produced it, so that the number of deaths became a direct function of time and resources invested in the work, death cranked out like sausages or cars. The atomic bomb was the culmination of that trend, a mechanism that visited total death upon its targets cheaply, indiscriminately and almost instantaneously: whether or not people died at Hiroshima and Nagasaki depended not on their identities—whether combatants or noncombatants, Korean forced laborers, American prisoners of war, pregnant women, children, grandmothers, newborn babies or Shinto priests—but merely on the accident of their distance from ground zero that day.
The closing days of the Second World War mark a turning point in human history, the point of entry into a new era when humankind for the first time acquired the means of its own destruction. Niels Bohr liked to say that the goal of science is not universal truth. Rather, Bohr thought, the modest but relentless goal of science is what he called “the gradual removal of prejudices” (18). The discovery that the earth revolves around the sun removed the prejudice that the earth is the center of the universe. The discovery of microbes removed the prejudice that disease is a punishment from God. The discovery of evolution removed the prejudice that Homo sapiens is a separate and special creation. The discovery of how to release nuclear energy, and its application to build weapons of mass destruction, is gradually removing the prejudice on which war itself is based: the insupportable conviction that there is a limited amount of energy available in the world to concentrate into explosives, that it is possible to accumulate more of such energy than one’s enemies and thereby militarily to prevail. So cheap, so portable, so holocaustal did nuclear weapons eventually become that even nation-states as belligerent as the United States and the Soviet Union preferred to sacrifice a portion of their sovereignty—preferred to forego the power to make total war—rather than be destroyed in their fury. Lesser wars continue, and will continue until the world community is sufficiently impressed with their destructive futility to forge new instruments of protection and new forms of citizenship. But world-scale war at least has been revealed to be historical, not universal, a manifestation of destructive technologies of limited scale. In the long history of human slaughter, that is no small achievement.
These are hard truths, but total war is harder. As the Polish mathematician Stanislaw Ulam, the coinventor of the hydrogen bomb, comments in his autobiography, “It is still an unending source of surprise for me to see how a few scribbles on a blackboard or on a sheet of paper could change the course of human affairs” (19).
(1) Personal communication.
(2) Spencer Weart and Gertrude Weiss Szilard, eds., Leo Szilard: His Version of the Facts. MIT Press, 1978, p. 62.
(3) Quoted in Charles Weiner, ed., Exploring the History of Nuclear Physics. AIP Conference Proceedings No. 7. American Institute of Physics, 1972, p. 90.
(4) Robert Serber, The Los Alamos Primer. University of California Press, 1992, p. xxvii.
(5) Quoted in Daniel J. Kevles, The Physicists. Vintage, 1979, p. 324.
(6) Serber, op. cit., Appendix I: The Frisch-Peierls Memorandum, p. 82.
(7) V. B. Adamsky, “Becoming a Citizen,” in B. L. Altschuler, et al., eds., Andrei Sakharov: Facets of a Life. Editions Frontières, 1991, p. 26 (translation edited).
(8) Evidence to this point is abundant in the Farm Hall tapes.
(9) Bertrand Goldschmidt, Atomic Adventure. Pergamon, 1964, p. 35.
(10) Kenneth Bainbridge, in Jane Wilson, ed., All in Our Time. Bulletin of the Atomic Scientists, 1975, p. 230.
(11) U.S. National Archives, Manhattan Engineer District papers, 5E, Terminal cables.
(12) Quoted in Robert H. Ferrell, ed., “Truman at Potsdam.” American Heritage, June-July 1980, p. 42.
(13) Quoted in Leonard Mosley, Marshall. Hearst, 1982, p. 337ff.
(14) Quoted in Mosley, op. cit., p. 340.
(15) J. F. Moynahan to L. R. Groves, May 23, 1946, MED 314.7, History.
(16) Quoted in Herbert Feis, The Atomic Bomb and the End of World War II. Princeton University Press, 1966, p. 248.
(17) United States Strategic Bombing Survey. Garland, 1976. Vol. X, p. 2.
(18) Niels Bohr, Atomic Physics and Human Knowledge. John Wiley, 1958, p. 31.
(19) S. M. Ulam, Adventures of a Mathematician. Charles Scribner’s Sons, 1976, p. 5.