Quantcast

Monday, February 28, 2011

THE PLAYLIST 4: DOO WOP


Chapter Four
Rama Linga Wop Bop and Doo-Wop
    Doo-wop is street corner harmony. Friends from church, the neighborhood or school would gather on a mutually convenient city street corner and sing. The groups usually had four or ideally five guys, made up of two tenors, one guy who could swing from a baritone to a bass, and, if they were lucky, someone who could fake a strong falsetto. Add some snapping fingers, clapping hands and slapping thighs, stir in the melodrama of Gospel arrangements with the secular focus of young love, and you had the sound of 15,000 young blacks and Italians on a march for local stardom. The term “doo-wop” itself comes from the prominent background rhythms the rest of the group voiced to push the soloist’s singing into the forefront. “Rama Lama Ding Dong,” “Bomp She Bop,” “Oop She Boop,” “Bom Bama Bom” and other glorious nonsense syllables propelled a small percentage of these young folks into the status of potential hit-makers. Once the record companies began gobbling these kids up on their way into the studio, some minimal soul-inspired instrumentation was added for balance, but tinkling pianos, softly groaning saxophones, and meandering guitars always blurred behind the sharp focus of these, the greatest of all doo-wop songs.


Sunday, February 27, 2011

THE PLAYLIST 2 & 3


Solo By Myself: Male Soul Singers
     Atlantic Records thrived on an attitude of experimentation within a framework of self-discipline. Founding brothers Ahmet and Nesuhi Ertegun willed into existence the first independent label whose very lifeblood surged for the sole purpose of fertilizing an environment in which the best of all music would flourish. The Erteguns and financier Herb Abramson launched their enterprise in 1947. As studious jazz hounds, the brothers recognized that great music required more than a great band and a great singer. The people behind the scenes contributed just as much as—and often more than—the more visible artists whose names appeared on the finished products. So Ahmet was moderately delighted to have the singers Ruth Brown, Joe Turner and “Stick” McGhee on his roster. He was outstandingly delighted to have producer Jerry Wexler, engineer Tom Dowd, and arranger Jesse Stone. Alone and together, these three men, along with Ahmet’s quirky knack for recognizing marketable songs (and a business savvy second to no one’s), polished and grooved out a Latin dance-based, saxophone-driven sound that influenced the next decade-and-a-half of R&B and soul artists. Through sheer will and drive these impresarios created some of the most endearing and moving music of all time.
Music That Matters

Atlantic Records

     While the Erteguns were signing their first acts, McMinnville, Tennessee’s own Randy Wood noticed that the electrical appliance store he owned and operated never seemed to have any of the records his customers wanted. What they wanted was rhythm and blues. The manufacturers and distributors of R&B knew they had a market; they just didn’t know how to get to that market. Wood expanded his store first into a mail order house, then converted it into a full time record store, and once he could no longer keep up with the demand, he launched his own label. He called it Dot. Soon discs by Ivory Joe Hunter, Brownie McGhee, and Shorty Long would display the Dot label.
     Dot was instrumental, as it were, in fomenting a trend that took the meat from a lot of music originating on its own label. In the early 1950s, three main types of radio programming existed: pop (which meant strictly white songs for white listeners), country music (still marketed for whites, but considered less refined than the pop stations), and rhythm and blues (aimed squarely at the black market, at least until disc jockey Alan Freed aimed it at everybody). Recognizing the profit potential in the pop market, Wood began recording cover versions of R&B songs using squeaky-clean nonentities such as Pat Boone and Billy Vaughn. So, for instance, Imperial Records would record Fats Domino’s version of “Ain’t That a Shame,” which played heavily on the R&B station, and Dot would redo the song for Pat Boone and release it as “Isn’t That a Shame.” But the best artistic decisions don’t always (if ever) happen in the boardroom. The label’s Nashville representative liked the sound of a young Arthur Alexander and signed him to work with producer Rick Hall. The label’s place in soul music’s heritage was firmly established.
     Cover songs were nothing new to Hugo Peretti and Luigi Creatore, two record producers who entered the business making stars out of people such as Mercury Records’ Georgia Gibbs, the queen of covers. Hugo and Luigi, as they remain known, used their earnings to produce easy-listening hits for Perry Como. But their supreme contribution to soul music came from their work with a young Sam Cooke, fresh from his Gospel days with the Soul Stirrers. The producers yearned to stay with a formula they understood: slick slop for goyim. But Cooke’s magic emerged in spite of their efforts to soften his edges. The result was a no-compromise combination for RCA that remains among the most passionate soul works ever waxed.
     Thanks to the pioneering work of these labels and producers, the foundation existed for the likes of Bobby Bland, Solomon Burke, Ray Charles and others to scale down the often ornate and lush arrangements while adding grit and husk to their own sound, thereby shifting the sound of R&B into a far more dance-oriented music that still retained the earliest jubilee effects of Gospel.

Ray Charles. Ray Charles Live! Atlantic. 1973.



     Before Ray Charles became a narcissistic, jingoistic panderer to the worst aspects of the national character (and not coincidentally a recipient of Pepsi’s largesse), he earned his living as a big deal blues hound whose sped-up salties, deeply incandescent vocals and innovative piano figures made his disquieting physical jerks and spins acceptable to whites. He was, deservedly, the best-selling black performer of the pre-Motown 1960s. Live!, which gives great moments from 1958 and 1959 (just when his career was crossing over and taking off), offers the playful, the serene, the caustic and bullying, the upbeat and the borderline psychotic. With two great versions of "The Right Time," the gloriously agonized "Drown in My Own Tears" and the show-stopping interplay between Charles and the Raelettes on "What'd I Say," even the other tracks stand out effortlessly and with grandeur. Music theorists may emphasize tension and release; Ray Charles the practitioner elicited a tension so pleasurable, the release was unnecessary.

Bobby “Blue” Bland. The Best of Bobby Bland. MCA. 1974.

     To be smooth and to shimmy-grind at the same time, and to do both well is a rare thing. But “Blue” has been doing both extremely well since the late 1950s. He hasn’t had much pop success, but he sure do sound good, especially on "I Pity the Fool," "Turn on Your Love Light" and "Farther on up the Road." The drums and trumpets on the latter song, for example, create an incredibly seductive tension with one another. Ta-daw tuh-da-taw tee-dah! scream the horns. The drummer takes his cue and rattles off the percussive opposites just as Bland sneaks in with the words “Somebody’s gonna hurt you like you hurt me,” as smooth and angry a moan as ever was recorded. The key to his artistic success is his ongoing ability to let the listener hear bitterness, tragedy and occasional rage behind the aural mask of total self-control.

Jackie Wilson. The Jackie Wilson Story. Epic. 1983.

     Everyone from Elvis Presley and Van Morrison to Chrissie Hynde has paid homage to Wilson, an appropriate state of affairs given the strength of the man’s work (at least the work captured on this album) and the horrible, lingering debilitation that took nearly ten years to cause Wilson’s death. But in his life there were orchestras happening in the high notes he reached on "Higher and Higher." Scaled-down symphonies duck-walked through "Reet Petite," "Doggin' Around" and especially "That's Why." Most collections overreach (after all, Jackie Wilson released ninety-nine singles and thirty albums, not counting posthumous issues) and include far too much of his overly polished work that reeks of someone trying too hard to go showbiz, sort of a black Bobby Darin, if you will. This is the ultimate, pared-down dream collection.

Brook Benton. "Rainy Night in Georgia." Atlantic. 1970.
     Tony Joe White, whose only other claim to recognition was in singing his own “Polk Salad Annie,” wrote this perfect mood piece for Brook Benton, whose only other claim to recognition was in writing “A Lover’s Question” for Clyde McPhatter.

Sam Cooke

"You Send Me."
"Only Sixteen."
"Wonderful World."
"Summertime."
"Chain Gang."
"Cupid."
"Twistin' the Night Away."
"Having a Party."
"Bring It on Home to Me."
"Another Saturday Night."
"Shake."
"A Change is Gonna Come."
All released on RCA.
     What we have here is a young man who had not only been raised singing in the church; he had become the star of the show with his intensity of spirit in the outstanding Gospel group, The Soul Stirrers. By secularizing (some said betraying) that talent, Sam Cooke moved into the ionosphere with the transcendent “You Send Me.” Never all that far from the showbiz brigands, he continued to record standards such as “Summertime” (done as well as could be, except for Billy Stewart’s superior version), and the topical, though questionable tribute to the Cha Cha. His passion, phrasing, range and specific sound influenced Rod Stewart, Graham Parker, Van Morrison and John Lennon, among others. “Bring It On Home to Me” is his best song, particularly due to the call and response “yeahs” he does with Lou Rawls.

Arthur Alexander.



"Anna (Go to Him)." Dot. 1962.
"Soldiers of Love." Dot. 1962.
     Most Beatles fans are aware that the Fab Four covered Alexander’s remarkably fine “Anna.” Fewer may know that they also recorded (but did not release while still a legally recording entity) his other soul single, “Soldiers of Love,” a fact made significant because of the way both songs point up the R&B roots of The Beatles’ early recordings. In the original versions by Arthur Alexander, both songs croak with a portliness, yet moan with frailty the way few others could, a feat made all the more remarkable considering that he recorded these for a label best known for its soulless cover versions. Johnny Rivers, among lesser talents, learned country-soul phrasing here. “June,” as friends called Arthur, was the first artist to record at Rick Hall’s Fame Studios in Muscle Shoals, Alabama. The money Hall earned from those early recordings enabled him to improve the technical quality of his operations, which in turn gave him the opportunity to record dozens of first-time artists, including a then-young and timid Aretha Franklin.

Solomon Burke. "Everybody Needs Somebody to Love." Atlantic. 1964.

     Anyone holding Barry White in esteem as a good singer will turn stiff as a rocket and passionate as a puma after hearing just the first few seconds of this song. Burke was, however, as shameless as Barry White.
     Burke’s considerable girth was used neither as humor nor as salacious entrée. It simply provided his voice with a powerful resonance. He didn’t sing words; he sang syllables: “Ev er ee bod dee (bu-bum bu-bum) needs sum bod dee.” He didn’t pander. He convinced us we were every bit as lonely and desperate as he claimed to be. Without saying it, the attitude of his voice declared that he had been to the mountaintop, that he had looked over at all his yearning throngs of children, and that they had cried out, begging for the answer. “You people want to know if it’s gonna be alright? Reverend Solomon is here to tell you people it’s gonna be all-right.”

Ben E. King. Ben E. King’s Greatest Hits. Atco. 1964.

     As a member of one incarnation of The Drifters, King sang lead on what many people consider to be the first soul song, “There Goes My Baby.” He achieved even hotter success on his own with priceless tunes like "I Who Have Nothing," "Don't Play That Song" (also well done by Aretha Franklin), "Spanish Harlem" (ditto), and the beatific "Stand By Me." The latter song, with its cricket punctuation and open road at midnight ambiance, brings up from a deepness beyond the singer’s testicles a bargain parallel to the kind we might make with God when we are so desperate that nothing else can protect us. In the first verse, King surrenders with something beyond humility to powers greater than himself (be it the force of friendship, the draw of romance, or deal-making with a Deity). Then in the final verse, he makes it clear that the gate to this relationship swings both ways. “If the sky that we look upon should crumble and fall/Or the mountains should tumble to the sea/I won’t be afraid/No I won’t shed a tear/Not as long. . . ” He comes back in the chorus with the clincher: “Whenever you’re in trouble you can stand by me!” Trouble or not, who could refuse such an offer?

Al Green.

"Tired of Being Alone."
"Call Me."
"I'm Still in Love with You."
"Love and Happiness."
"Let's Stay Together."
"I Can't Get Next to You."
"Look What You Done for Me."
All released on Hi Records.
     The Reverend Al Green is a modern anomaly: an R&B wonder who made it big in the secular world, took his earnings and went back to the church. Like a soaring and raspy-voiced Sam Cooke, Green seduced in every vocal and musical nuance, bleeding proudly in “Tired of Being Alone,” pleading loudly in “Let’s Stay Together,” and even outdoing the Temptations with his version of “I Can’t Get Next to You,” as secularly celebratory a song as ever hit the charts. His other hits rely a bit more on the singer than the song, but one could substitute the Lord for the Girl and the tunes would still be revelatory. In “Let’s Stay Together,” Green sings of his commitment to a woman with the same emotional tenor many Christians use to describe the sensation of being saved. “Call Me” even has the studio equivalent of an Amen Choir humming in full support, devotedly waiting for their time to respond to the Reverend’s call. If anyone ever learned his trade from the angels, it was this man.

Saturday, February 26, 2011

THE PLAYLIST 1: LOTS OF STRINGS AND OTHER THINGS

Lots of Strings and Other Things:
 Early Soul Music



Atlantic Records

Music That Matters

     Soul music of the early 1950s urbanized country blues. Producers rounded off the rough edges of Mississippi Delta explosions and toned down the ferocity of Chicago enthusiasm while redirecting and compacting all that shaved energy into a lulling, harmonic structure—often accentuated by strings—while pushing the vocals far up front. Fast or slow, the best melodies left an impression far beyond their duration. The lyrics, delicately and deliberately phrased, sang of the vulnerability inherent in the new sophistication. “When this old world starts getting me down/And people are just too much for me to bear,” from the Drifters’ beatific “Up on the Roof,” is the perfect embodiment of that sensibility. As we are drawn to the city as the host of civilization and as it is drawn to us, the pressures that accompany the pleasures of social interaction threaten to consume the consumers. During those extended moments, the city itself provides its own refuge, a place where refugees can stare off into eternity just long enough for the pressure to subside. Early soul music, more than anything that came before it, promised that we could have it all: the bustle and the calm, the excitement and the solitude, poverty and riches, pain and bliss.

The Drifters


Thursday, February 24, 2011

A BRIEF HISTORY OF NUCLEAR POWER AND THE BRIEF FUTURE OF EARTH


Philropost


Introduction

     Nuclear power is both simple and complex. Neutron-induced fission produces great heat. That heat boils water into steam. Steam spins a turbine, which in turn powers a generator to make electricity. As a result, people can heat and cool their homes, operate their blow dryers, use their laptop computers, light their rooms at night, and feel safe in their cities. Nuclear power has been harnessed to make devastating bombs that can level cities and states and countries. Nuclear power can only be managed with human assistance and creativity. And yet the limitations inherent in human ingenuity have led to tremendous accidents which have sickened many and killed others. At the same time, oil prices continue to rise while coal mines collapse and workers die, so some people argue that nuclear energy is cheap and safe compared to other forms of power. Still, others have pointed to the problem of storage and disposal of nuclear byproducts, in particular toxic waste, substances that may remain deadly for billions of years. And then there are the so-called alternative energy sources, specifically wind and solar, both heralded by environmentalists while industry attempts to find a way to blend them with commerce. So while the process of nuclear energy is relatively simple, the moral, social, political and economic aspects of this power make it very complex.






Early Nuclear Days

     The world’s first nuclear fission occurred in 1934 when physicist Enrico Fermi irradiated uranium with neutrons. He believed he had produced elements beyond uranium, not realizing that he had in fact split the atom.
     But what is nuclear fission? Atomic theory holds that each atom has at its core a nucleus. Nuclear fission happens when large nuclei break up into two nearly equal fragments. In nuclear reactors and nuclear bombs the newly created neutrons cause other nuclei to fission, thus setting off a chain reaction. When humans are able to control this chain, they have in essence built a nuclear reactor. The heat this produces boils water and the resulting steam is used to generate electricity. But sometimes the chain reaction is not controlled. When this happens, humans will observe an explosion, properly thought of as a bomb.
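     To make the bookkeeping concrete, here is one textbook example of an induced fission of uranium-235. The fragments vary from event to event, so this is only one common channel, and the energy figure is the usual rough textbook value rather than anything drawn from this article:

\[ n + {}^{235}_{92}\mathrm{U} \;\longrightarrow\; {}^{141}_{56}\mathrm{Ba} + {}^{92}_{36}\mathrm{Kr} + 3n + \text{roughly } 200\ \mathrm{MeV} \]

The two or three neutrons set free by each split are what make the chain reaction possible, since each one can go on to split another nucleus.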

Bombs
     But how is a nuclear reaction controlled? The fission is actually slowed down more than controlled, and this happens when operators insert control rods made out of elements such as hafnium, cadmium, or boron, rods which are very good at absorbing neutrons. These control rods minimize neutron fluxes and manage the rates of fission.
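     Reactor physicists compress this balancing act into a single bookkeeping number, the effective multiplication factor k, the average number of neutrons from one fission that go on to cause another fission. This is standard textbook shorthand rather than anything drawn from this article:

\[ k < 1:\ \text{subcritical, the reaction dies out}; \qquad k = 1:\ \text{critical, steady power}; \qquad k > 1:\ \text{supercritical, power grows.} \]

Pushing the control rods in soaks up neutrons and nudges k below one; drawing them out nudges it back up.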
     Fermi had fallen into a process that would change the world forever. As a professor of theoretical physics at the University of Rome, he experimented with fission by bombarding different elements with neutrons, in the process unwittingly splitting a uranium atom, an accomplishment that won for him in 1938 the Nobel Prize for Physics. This award was well timed because it allowed Fermi and his wife Laura to escape Rome one step ahead of the Italian Fascists. Once in America, he worked with physicist Leo Szilard to create the first ever artificial atomic reactor, one called an atomic pile. An atomic pile is a type of reactor with a core composed of graphite mixed in with uranium. Their Pile-1 was built beneath the stands at a football stadium at the University of Chicago.


     As sometimes happens in physics, other people were working on similar ideas around this same time. Two of these, expatriate physicists Otto Frisch and Rudolf Peierls, living in the United Kingdom, prepared a theoretical analysis of the possibility of fast fission in Uranium-235. Fast fission is fission caused by neutrons that are in a fast energy range and the process results in an increase in the amount of neutrons in the reactor core. In early 1940, in what became known as the Frisch-Peierls memorandum, they demonstrated that the fissioning of uranium could create a massive chain reaction. When Uranium-235 was used, the result would be a very nasty weapon. Uranium-235 differs from the more plentiful Uranium-238 in that it is less than one percent of all uranium isotopes and also in that it is the only one that can produce enough free neutrons to sustain a chain reaction. The number 235 refers to what scientists call the mass number of each isotope, or the number of protons added to the number of neutrons, in this case being 92 protons plus 143 neutrons. Because there is so little U-235 and so much demand, natural uranium is often enriched to raise its proportion of U-235. This is accomplished by gaseous diffusion, gas centrifuge, or laser separation.
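     Restating that arithmetic in the standard notation, the mass number A is simply the proton count Z plus the neutron count N:

\[ A = Z + N, \qquad {}^{235}\mathrm{U}:\ 92 + 143 = 235, \qquad {}^{238}\mathrm{U}:\ 92 + 146 = 238. \]

The two isotopes are chemically identical, which is why gaseous diffusion and centrifuges have to sort them by that three-neutron difference in mass.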
     With British Prime Minister Winston Churchill’s endorsement, the British Chiefs of Staff agreed on September 3, 1941 to begin development on an atomic bomb. But it was not until December 18, after months of bureaucratic struggling and the United States’ entry into World War II, that a U.S. project to investigate atomic weapons finally got underway.
     Enrico Fermi’s ongoing work with graphite and uranium was, in January 1942, transferred to a new secret project, code named Metallurgical Laboratory (Met Lab) at the University of Chicago. In April, Fermi began design of CP-1, the world’s first human-built nuclear reactor. Throughout early and mid-1942, fundamental neutron physics research proceeded, as did work on developing industrial scale processes for producing fissile materials, which are, by definition, able to sustain a chain reaction of nuclear fission. This means that they can be used to fuel a thermal reactor, a fast reactor, or a nuclear explosive. But it became increasingly obvious that since this was to be an industrial scale project, a proven project manager was needed. Furthermore, since it was a weapons project, it needed to be brought under an organization experienced in producing weapons.
     On June 18, 1942, Brigadier General Wilhelm Styer (a member of the Military Policy Committee) ordered Colonel James Marshall to organize an Army Corps of Engineers District to take over and consolidate atomic bomb development. During August of that year, Marshall created a new District Organization with the intentionally misleading name “Manhattan Engineer District” (MED), now commonly called The Manhattan Project.




World War II and Arms Build-Ups

     The nuclear age officially began on July 16, 1945 when the United States exploded the first nuclear bomb, code named “Trinity,” at Alamogordo, New Mexico. Today the Alamogordo Chamber of Commerce boasts that the state’s southeastern city is the gateway to the Land of Enchantment, a Space Museum, and a superb July 4th celebration. But years earlier it was the site of the first test of a nuclear bomb. Theoretical physicist Robert Oppenheimer called it the Trinity Test, a name inspired by poet John Donne’s Holy Sonnet 12.

Father, part of his double interest
Unto thy kingdom, thy Son gives to me,
His jointure in the knotty Trinity
He keeps, and gives me his death’s conquest.
This Lamb, whose death, with life the world hath blessed,
Was from the world’s beginning slain, and he
Hath made two wills, which with the legacy
Of his and thy kingdom, do thy sons invest.
Yet such are thy laws, that men argue yet
Whether a man those statutes can fulfill;
None doth, but thy all-healing grace and Spirit
Revive again what law and letter kill.
Thy law’s abridgement, and thy last command
Is all but love; oh let that last will stand!

     Robert Oppenheimer came to prominence after President Franklin Roosevelt received word that German scientists had split the atom and hence might quickly develop a nuclear weapon. Roosevelt authorized the American bomb effort, and Oppenheimer was eventually chosen to direct its scientific laboratory at Los Alamos, the heart of what came to be known as The Manhattan Project.

     The site chosen for the Trinity Test was a remote corner on the Alamogordo Bombing Range known as the Jornada del Muerto, or Journey of Death, 210 miles south of Los Alamos. While Manhattan Project staff members watched, the device exploded across the New Mexico desert, vaporizing the tower and turning the asphalt around the base of the tower into green sand. Seconds after the explosion came a huge blast wave, and heat seared out across the desert. The shock wave broke windows more than one hundred miles away. As the orange and yellow fireball stretched up and spread, a second column, narrower than the first, rose and flattened into a giant mushroom shape, thus providing the atomic age with a visual image that has become imprinted on the human consciousness as a symbol of power and awesome destruction.
Atomic bomb
     The Manhattan Project operated under several code names, including S-1, but whatever its name, it was launched under FDR and did indeed result in the development of the Atomic Bomb. When Roosevelt died in April, 1945, Harry Truman became President. He was, of course, informed of the existence of the Project and of the capability of the United States to use the Bomb. Robert Wilson, a member of the Manhattan Project, advised the new President that the World War II Allies should proudly demonstrate the power of the Bomb.
     Not everyone agreed. James Franck, another member of the Manhattan Project, told the President: “The military advantages and the saving of American lives achieved by the sudden use of atomic bombs against Japan may be outweighed by the ensuing loss of confidence and by a wave of horror and repulsion sweeping over the rest of the world and perhaps even dividing public opinion at home.”
     General Dwight Eisenhower, the Supreme Allied Commander, concurred. He later said, “I voiced [to Truman] my grave misgivings, on the basis of my belief that Japan was already defeated and that dropping the Bomb was completely unnecessary.”
     On August 6, 1945, a B-29 long-range heavy bomber took off from the island of Tinian (infamous as the first place where napalm was ever used) and headed north by northwest toward Japan. The aircraft was called the Enola Gay, and had been named after the mother of pilot Colonel Paul Warfield Tibbets. The bomber’s primary target was the city of Hiroshima, located on the deltas of southwestern Honshu Island facing the Inland Sea. Hiroshima had a civilian population of almost 300,000 and was an important military center, containing about 43,000 Japanese soldiers.
     At 8:15 a.m. Hiroshima time, the Enola Gay released Little Boy, the 9,700-pound uranium bomb, over the city. Forty-three seconds later a huge explosion lit the morning sky as Little Boy detonated 1,900 feet above the city, directly over a parade field where soldiers of the Japanese Second Army were doing calisthenics. Though already eleven-and-a-half miles away, the Enola Gay was rocked by the blast. Some 70,000 people died as a result of the initial explosion, heat, and radiation. This included twenty American airmen being held as prisoners in the city. By the end of the year, the Hiroshima death toll was over 100,000. The five-year death toll exceeded 200,000, as cancer and other long-term effects took hold.
    The next break in the weather over Japan was due to appear three days after the bombing of Hiroshima, to be followed by five days of prohibitive weather. The plutonium bomb, nicknamed “Fat Man,” was rushed into readiness to take advantage of this window of opportunity. A B-29 named Bock’s Car lifted off from Tinian and headed towards the primary target of Kokura Arsenal, a massive collection of war industries adjacent to the city of Kokura. Clouds and smoke obscured Kokura that morning, however, and the crew diverted to the secondary target, Nagasaki. At 11:02 a.m., at an altitude of 1,650 feet, Fat Man exploded over the city of Nagasaki. The yield of the explosion was later estimated at 21 kilotons, forty percent greater than that of the Hiroshima bomb. Although the destruction at Nagasaki received less worldwide attention than that at Hiroshima, it was nonetheless extensively deadly. Everything up to half a mile from ground zero was completely destroyed, including even the earthquake-hardened concrete structures that had sometimes survived at comparable distances at Hiroshima.
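     As a quick check on that comparison, using the commonly cited figure of about 15 kilotons for the Hiroshima bomb (a number from standard histories of the bombings, not from this article):

\[ 15\ \text{kilotons} \times 1.4 = 21\ \text{kilotons}, \]

which matches the estimated yield at Nagasaki.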
    The same Robert Wilson who had earlier advised Truman to show the power of the bomb to the Japanese was quick to reconsider. After the devastation at Hiroshima and Nagasaki, the Manhattan Project physicist announced:
"A specter is haunting this country—the specter of nuclear energy. As a scientist who worked on the atomic bomb, I am appalled that the public is so apathetic and so uninformed about the dangerous social consequences of our development. There is no secret of the atomic bomb. In my opinion, in two to five years other countries can also manufacture bombs, and bombs tens, hundreds, or even thousands of times more effective than those which produced such devastation at Hiroshima and Nagasaki. This country with its concentrated industrial centers is entirely vulnerable to such weapons; nor can we count on, or even expect, effective counter-measures. Unless strong action is taken within the near future toward a positive control, this country will be drawn into an armament race which will inevitably end in catastrophe for all participants. . . . It is the responsibility of the press to stimulate public discussion on this vital matter and to educate the people as rapidly as possible. Where security permits, my colleagues are eager to help with scientific information. It was our hope in developing the bomb that it would be a great force for world cooperation and peace."
        As to proliferation, he was certainly correct. On August 29, 1949, the Soviet Union became the second country to test an atomic bomb. After the destruction of Hiroshima and Nagasaki, the Soviets’ atomic bomb program shifted into high gear. The USSR began construction of a near copy of the Fat Man, using the detailed design descriptions presumably provided by Klaus Fuchs. This replica, which the western nations named Joe-1, was detonated at the Semipalatinsk Test Site in Kazakhstan. Its estimated yield was about 22 kilotons. The Russians called it First Lightning, but the West named it after the Soviet General Secretary, Joseph Stalin.

     At the outbreak of World War II, Klaus Fuchs, being a German citizen, was interned in a camp in Quebec, Canada. However, Professor Max Born of Edinburgh University intervened on his behalf, and by early 1941, Fuchs had returned to Edinburgh, where he was approached by Rudolf Peierls to work on the British atomic bomb research project. He became a British citizen in 1942.
     The following year, Fuchs was among the British scientists sent to the U.S. to collaborate on the atom bomb. He was sent to the weapons laboratory in Los Alamos, New Mexico, where he worked in the theoretical division. His chief area of expertise was the problem of imploding the fissionable core of the plutonium bomb. He was present at the Trinity Test.
     Fuchs later testified that he passed detailed information on the project to the Soviet Union through a courier in 1945, and further information about the hydrogen bomb in 1946 and 1947. But it was not until 1948 that it was discovered that the Manhattan Project security had been breached, and not until 1949, when Fuchs had returned to England and the Harwell Atomic Energy Research Establishment, that he was confronted by intelligence officers as a result of the cracking of Soviet ciphers. Fuchs confessed in January 1950 and was convicted on March 1, 1950, and sentenced to 14 years in prison. His testimony to British and American intelligence agencies eventually led to the trials of David Greenglass and Julius and Ethel Rosenberg in the U.S.


You Light Up My Bulbs

     On December 20, 1951, in the first instance of using nuclear power to produce electricity, the National Reactor Testing Station at Idaho Falls lit four of its own light bulbs by employing its Experimental Breeder Reactor-1. The following day it generated enough power to illuminate the entire EBR-1 facility. The reactor not only pioneered the atomic production of electricity, it also demonstrated that a reactor could generate more atomic fuel than it consumed. EBR-1 did this by bombarding uranium base material with excess neutrons that otherwise would have been absorbed by shielding. That turned enough of the uranium into plutonium, also a reactor fuel, to more than compensate for the fuel EBR-1 burned up.
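     The breeding step EBR-1 demonstrated is, in textbook form, the conversion of non-fissile uranium-238 into fissile plutonium-239 by neutron capture followed by two beta decays (the half-lives shown are the standard approximate values, not figures from this article):

\[ {}^{238}\mathrm{U} + n \;\longrightarrow\; {}^{239}\mathrm{U} \;\xrightarrow{\ \beta^-,\ \sim 23\ \text{min}\ }\; {}^{239}\mathrm{Np} \;\xrightarrow{\ \beta^-,\ \sim 2.4\ \text{days}\ }\; {}^{239}\mathrm{Pu}. \]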



The Mike Test

     The United States detonated a 10.4-megaton hydrogen device on the Enewetak Atoll in the Pacific Ocean’s Marshall Islands on November 1, 1952. The test, code named “Mike,” was the first successful implementation of Edward Teller and Stanislaw Ulam’s concept for a Super Bomb. Even those who witnessed previous atomic detonations were stunned by the blast of The Mike Test. The mushroom cloud, when it reached its furthest extent, stretched 100 miles wide and 25 miles high. The explosion evaporated Elugelab Island, leaving behind a crater more than a mile wide. The blast destroyed life on the surrounding islands.
     It may be helpful to try to understand something about the men behind this test. One of the key enthusiasts was the aforementioned Edward Teller. Of all the scientists who worked on the U.S. nuclear weapons program, none were more controversial than he. Described by one Nobel Prize winner in physics as “one of the most thoughtful statesmen of science,” and by another as “a danger to all that’s important,” Teller was recognized by most of his colleagues as being one of the most imaginative and creative physicists alive. But at the same time, his single-minded pursuit of the hydrogen bomb and his autocratic style alienated many of the scientists with whom he worked.
     After World War II, Teller left the Manhattan Project and returned to his teaching job at the University of Chicago. But once the Soviet Union conducted its test of an atomic device, he did his best to drum up support for a crash program to build a hydrogen bomb. Teller argued that a Super Bomb was essential to the very survival of the United States. He said: “If the Russians demonstrate a Super before we possess one, our situation will be hopeless.” President Truman agreed. When Teller and mathematician Stanislaw Ulam finally developed a Hydrogen Bomb design that would work, Teller was not chosen to head the project. He left Los Alamos and soon joined the newly established Lawrence Livermore Laboratory, a rival weapons facility in California.
      But back to the Super Bomb. Mike was incredibly large. In 1952, the smallest atomic bomb with enough explosive force to set off a fusion reaction was almost four feet in diameter. The actual casing for the “Mike” gadget would end up being twenty feet long. According to one of the scientists who worked on the project, a full-scale drawing of the device became essential for everyone on the team to communicate effectively with each other. The drawing was so big that a balcony had to be built from which to view it. 

Philropost
     Physicist Herbert York summed up the implications of the first test of a thermonuclear device: “The world suddenly shifted from the path it had been on to a more dangerous one. Fission bombs, destructive as they might have been, were thought of [as] being limited in power. Now, it seemed we had learned how to brush even these limits aside and to build bombs whose power was boundless.”
     Since the days of Mike, the United States has carried out 1,030 nuclear weapons tests, the last one on September 23, 1992. The Soviet Union conducted 715 such tests, the last one on October 25, 1990. France comes in third with 210 tests, its last one occurring on January 27, 1996. Britain has detonated 45 test bombs, their final one on November 26, 1991. And China rounds out the current list with 43 explosions.


Chalk River Disaster

      Not all nuclear disasters were the results of uncontrolled nuclear fission. Some were the consequence of “controlled” fission. On December 12, 1952, millions of gallons of radioactive water accumulated inside the nuclear reactor at Chalk River, near Ottawa, Canada, the result of a partial meltdown of the reactor’s fuel core. Future U.S. President Jimmy Carter, then a Navy officer, was part of the clean-up crew. 

     Gordon Edwards of the Canadian Coalition for Nuclear Responsibility describes what happened. 
"The first of two accidents in the 1950s occurred in 1952, when the NRX reactor underwent a violent power excursion that destroyed the core of the reactor, causing some fuel melting. Unaccountably, the shut-off rods failed to fully descend into the core. A series of hydrogen gas explosions hurled the four-ton gasholder dome four feet through the air where it jammed in the superstructure [Emphasis added]. Thousands of curies of fission products were released into the atmosphere and a million gallons of radioactively contaminated water had to be pumped out of the basement and “disposed of” in shallow trenches not far from the Ottawa River. The core of the NRX reactor could not be decontaminated; it had to be buried as radioactive waste. Five years later, in 1958, several metallic uranium fuel rods in the NRU reactor overheated and ruptured inside the reactor core. One of the damaged rods caught fire and was torn in two as it was being removed from the core by a robotic crane. As the remote-controlled crane passed overhead, carrying the larger portion of the damaged rod, a three-foot length of fiercely burning uranium fuel broke off and fell into a shallow maintenance pit. The burning fuel lay there, spreading deadly fission products and alpha-emitting particles throughout the reactor building. The ventilation system was jammed in the open position, thereby contaminating the accessible areas of the building as well as a sizable area downwind from the reactor site. A relay team of scientists and technicians eventually extinguished the fire by running past the maintenance pit at top speed wearing full protective gear, dumping buckets of wet sand on the burning uranium fuel."

     For the people inside the Chalk River Plant, the world must have been coming to an end. A dome jumps through the air, radioactive water has to be pumped into trenches, the reactor has to be encased and buried, fuel is burning everywhere, spreading gaseous poison, and a bunch of guys in yellow protective Demron suits are spreading beach sand over fires that don’t want to go out. And unseen but real radiation, moving at the speed of light, embeds itself in everything it can.

Edwards went on to discuss the clean-up procedures. "Over a thousand men were involved in the cleanup operations following these two accidents. More than 600 men were required for the NRU cleanup alone. Official AECL reports stress that very few of these men were over-exposed to radiation— that is, most of the recorded radiation doses did not exceed the levels that were considered permissible for atomic workers at that time. The reports also imply that no adverse health effects were caused by the exposures received. However, no medical follow-up has ever been done to see whether the population of men involved exhibited a higher-than-normal incidence of cancer later in life."
     The current reactor at Chalk River, together with one in the Netherlands, produces about 90% of the world’s material for nuclear medicine. This became significant on November 18, 2007, when the National Research Universal Reactor (NRU), which makes medical radioisotopes, was shut down for routine maintenance. The shutdown was extended when Atomic Energy of Canada Limited, or AECL (roughly a Canadian counterpart to the Atomic Energy Commission), in consultation with the Canadian Nuclear Safety Commission (CNSC), decided to connect seismically qualified emergency power supplies to two of the reactor's cooling pumps (in addition to the AC and DC backup power systems already in place), a step that had been required as part of the reactor's August 2006 operating license issued by the CNSC. Because Chalk River makes the majority of the world’s supply, the extended shutdown produced a worldwide shortage of radioisotopes for medical treatments. On December 11, 2007, the Canadian House of Commons, acting on independent expert advice, passed emergency legislation authorizing the restarting of the NRU reactor and its operation for 120 days, counter to the decision of the CNSC; the bill received Royal Assent the next day. Prime Minister Stephen Harper blamed the “Liberal-appointed” CNSC for the shutdown which, he claimed, jeopardized the health and safety of tens of thousands of Canadians, insisting that there was no risk in restarting the reactor.

Love Canal


     In 1953, waste from chemical plants destroyed the New York town of Love Canal. The village remained uninhabitable for the next forty years. How did such a travesty of sense, logic and morality come about?
     Hooker Chemical and Plastics Corporation acquired Love Canal for its own private use in 1947 and buried 21,000 tons of toxic waste there over the next five years. After the site couldn’t hold any more, Hooker filled in the canal. Love Canal is a neighborhood of Niagara Falls, New York. During these years, Niagara’s population was growing and the city was desperate for new land. The local school board bought the Love Canal site from Hooker for one dollar. The subsequent construction of a school punctured the clay barrier Hooker Chemical had used to contain the toxic waste. The danger of toxic waste can last for billions of years, so a leak such as this is extremely important to address.


     The Niagara and Love Canal population reported health problems and strange odors over the next years, but it was not until the president of the Love Canal Homeowners Association, one Lois Gibbs, investigated the situation that the severity was realized. The homeowners, many of them extremely sick, had to fight Hooker Chemical and the U.S. Government and only received financial compensation for having to relocate in 1978 when then-President Carter declared the site a Federal Emergency Area.
     The government did what governments do when they hope to legitimize their own cowardice. They ordered scientists to investigate. Those scientists determined that the chemical plant had dumped chemicals that seeped into basements and the air and were responsible for the sickness of the residents. Over 800 families relocated and the Environmental Protection Agency sued Hooker’s parent company, Occidental Petroleum, for $129 million.

Eureka! Obninsk! Electricity!

     The world’s first nuclear power plant generated electricity in Obninsk in the Soviet Union on June 27, 1954. The capacity was only five megawatts, small by today’s standards, with most reactors now exceeding 1,000 megawatts. This power plant was shut down in May, 2002 because, as Russian Mayak Radio reported, its further operation became pointless. The radio station reported that the reactor had come to the end of its life after almost fifty years in operation.
     The world’s first nuclear-powered submarine, the Nautilus, put to sea under nuclear power in January, 1955. Nautilus' nuclear reactor allowed it to dive longer, faster, and deeper than any submarine before it. Nautilus continued to break records in 1958 by becoming the first vessel to cross beneath the North Pole. Decommissioned in 1980, the submarine was converted into a museum in 1985.


Government Joins Industry, Industry Always Wins

    In 1955 the Atomic Energy Commission announced the beginning of a cooperative program between government and industry to develop nuclear power plants. Arco, Idaho (population 1,000) became the first U.S. town powered by nuclear energy. An experimental reactor, BORAX III, provided energy for the first U.S. nuke town. The power was generated at the National Reactor Testing Station in Idaho.
     This same year the United Kingdom announced its decision to develop thermonuclear weapons. A few months later, the United Nations sponsored the first international conference on what they termed the “peaceful” uses of nuclear energy, held in Geneva, Switzerland.
     This may be a good place to point out the different types of reactors in existence.
    The first type is the Pressurized Water Reactor. In the PWR the water which passes over the reactor core to act as moderator and coolant does not flow to the turbine, but is contained in a pressurized primary loop. The primary loop water produces steam in the secondary loop which drives the turbine. Put simply, water gets hot, converts to steam, and the steam powers the turbine which in turn generates electricity. The obvious advantage to this is that a fuel leak in the core would not pass any radioactive contaminants to the turbine and condenser.
     Second is the Boiling Water Reactor. In the BWR, the water which passes over the reactor core to act as moderator and coolant is also the steam source for the turbine. While this seems more efficient, the disadvantage is that any fuel leak might make the water radioactive and that radioactivity would reach the turbine and the rest of the loop.
     In the Pressurized Water Reactor, the water which flows through the reactor core is isolated from the turbine. But the Gas-Cooled Fast Reactor (GFR) system features a fast-neutron-spectrum, helium-cooled reactor and closed fuel cycle. The GFR uses a direct-cycle helium turbine for electricity generation, or can optionally use its process heat for production of hydrogen. Through the combination of a fast spectrum and full recycle of actinides (such as Uranium), the GFR minimizes the production of long-lived radioactive waste. The GFR’s fast spectrum also makes it possible to use available fissile and fertile materials (including depleted uranium) much more efficiently than thermal spectrum gas reactors with once-through fuel cycles. Several fuel forms are candidates that hold the potential for operating at very high temperatures and to ensure an excellent retention of fission products: composite ceramic fuel, advanced fuel particles, or ceramic-clad elements of actinide compounds. Core configurations may be based on pin- or plate-based assemblies or on prismatic blocks. The GFR reference design has an integrated, on-site spent fuel treatment and refabrication plant.
     Russia holds an unintentional monopoly on the Light Water Graphite Reactor. The Soviet designed RBMK is a pressurized water reactor with individual fuel channels which uses ordinary water as its coolant and graphite as its moderator. It is very different from most other power reactor designs in that it was intended and used for production of both plutonium and power. The combination of graphite moderator and water coolant is found in no other power reactors. The design characteristics of the reactor were shown, in the Chernobyl accident, to cause instability when at low power. This was due primarily to control rod design and a positive void coefficient.
     The Fast Neutron Reactor, the final major type, more deliberately uses uranium-238 as well as the fissile U-235 isotope used in most reactors. If such reactors are designed to produce more plutonium than they consume, they are called Fast Breeder Reactors (FBR). But many designs are net consumers of fissile material, including plutonium. Fast neutron reactors also can burn long-lived actinides which are recovered from the used fuel of ordinary reactors.
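     For readers who like their summaries compact, here is a minimal sketch in Python that restates the distinctions drawn above as a small data structure. The type names, field names, and one-line descriptions are invented here purely for illustration; they paraphrase the paragraphs above and are not drawn from any real reactor software or design document.

from dataclasses import dataclass

@dataclass
class ReactorFamily:
    name: str
    coolant: str
    moderator: str
    steam_path: str      # how the heat finally reaches the turbine
    notable_trait: str   # the point the text above emphasizes

# Illustrative summary only; the wording paraphrases the paragraphs above.
REACTOR_FAMILIES = [
    ReactorFamily("PWR", "pressurized water", "water",
                  "separate secondary loop makes the steam",
                  "a fuel leak stays out of the turbine and condenser"),
    ReactorFamily("BWR", "boiling water", "water",
                  "the coolant itself becomes the turbine steam",
                  "a fuel leak can contaminate the whole loop"),
    ReactorFamily("GFR", "helium", "none (fast neutron spectrum)",
                  "direct-cycle helium turbine",
                  "closed fuel cycle that recycles actinides"),
    ReactorFamily("RBMK", "ordinary water", "graphite",
                  "water boils in individual fuel channels",
                  "shown at Chernobyl to be unstable at low power"),
    ReactorFamily("FBR", "varies (liquid sodium is common)", "none (fast neutron spectrum)",
                  "heat passed to a separate steam loop",
                  "can breed more plutonium than it consumes"),
]

if __name__ == "__main__":
    for family in REACTOR_FAMILIES:
        print(f"{family.name}: coolant={family.coolant}; moderator={family.moderator}; {family.notable_trait}")

The only point of the sketch is the comparison itself: the same few design decisions, namely coolant, moderator, and how steam reaches the turbine, account for most of the differences the paragraphs above describe.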



Accidents Will Happen

     What constitutes a nuclear accident has never been agreed upon, not even by nuclear physicists or the various Atomic Energy agencies throughout the world. But we take “accident” to mean that something dangerous happened that was not planned. With that view in mind, here is an incomplete list—incomplete partly because not all accidents are likely to have been reported and partly because a list of all accidents would require a separate book. We focus here on the most significant or horrendous events.

     A container of uranium hexafluoride exploded on September 2, 1944, in the Oak Ridge transfer room, killing Peter N. Bragg, Jr., and Douglas P. Meigs and injuring three others. A steam pipe exploded and the incoming water vapor combined with the uranium compound to form hydrogen fluoride, a dangerous acid, which all five inhaled. Bragg and Meigs died from whole-body acid burns.
     Shortly after the Hiroshima and Nagasaki bombs were detonated in August 1945, Harry K. Daghlian, Jr., working at the Los Alamos Omega Site, accidentally created a supercritical mass when he dropped a tungsten carbide brick onto a plutonium core. He removed the piece, but was fatally irradiated in the incident.
     It was in November, 1950, when a B-50, returning one of several U.S. Mark IV bombs secretly deployed in Canada, developed engine trouble and jettisoned the weapon at 10,500 feet. The bomb, which carried some uranium but not its plutonium core, was set to self-destruct at 2,500 feet and dropped over the St. Lawrence River off Rivière-du-Loup, Quebec. The explosion shook area residents and scattered nearly 100 pounds of uranium.
     During the early morning of March 1, 1954, a Japanese fishing boat, the Fukuryu Maru, or Lucky Dragon, and its crew witnessed what they thought was the sun rising to the west of them as they sailed in the Pacific Ocean. That struck the more alert of them as unlikely, what with the sun being in the habit of rising in the east. What they were actually seeing was the 15 Megaton detonation of the Hydrogen “Bravo” Bomb at the Bikini Atoll, eighty-five miles away. Several hours later, white ash began to fall like snow onto their ship. Many of the crew members, thinking they had come upon nonmelting snowflakes, began gathering the ash into bags as souvenirs. Before the actual sun set, the entire crew had fallen ill. (The 86 residents of Rongelap Atoll had similar experiences from their own deadly snow.) The twenty-three crew members were hospitalized in Japan and one later died of kidney failure due to being exposed to radiation. Not surprisingly, the incident caused a rift in relations between Japan and the United States because the U.S. did not warn Japan or any other country of the bomb’s testing, leaving the Lucky Dragon exposed to the fallout. The U.S. issued an apology and paid $2 million in compensation. The twenty-three crewmen were among 264 people accidentally exposed to radiation because the explosion and fall-out had been far greater than expected. The original natives were granted $325,000 in compensation and returned to Bikini in 1974 from which they were again evacuated four years later when new tests showed high levels of residual radioactivity in the region. Twenty-three nuclear tests were carried out at Bikini between 1946 and 1958.

    It was in 1957 in the South Ural Mountains of the USSR that radioactive waste exploded at a Soviet nuclear weapons factory, resulting in the evacuation of 10,000 people. Soviet officials claimed there were no casualties. However, the Russian scientist who reported the accident said that hundreds of people died from radiation sickness. A series of less prominent accidents preceded and followed this explosion, and the water supply for people remaining in the area was polluted. More than 500,000 inhabitants of the region were exposed to radiation as a result.
     In October of that same year, a fire in a graphite-cooled reactor north of Liverpool, England, sent radiation clouds into the countryside, contaminating a 200 square mile area.
     The world’s first official fatal atomic accident happened on January 3, 1961, when a small experimental BWR called SL-1 (Stationary Low-Power Reactor Number 1) in Idaho Falls blew up after a control rod was manually removed. At the time of this accident, a three-man crew was on top of the reactor where they were assembling the control rod drive mechanisms and housing. The nuclear excursion, which resulted in an explosion, was caused by manual withdrawal of the central control rod blade from the core beyond the limits specified in the maintenance procedures. Two crewmen died instantly from the force of the explosion. The third man died two hours later from a head injury. Twenty-two of the people engaged in recovery operations received radiation exposure. Some gaseous fission products, including radioactive iodine, escaped into the atmosphere and drifted downwind in a thin plume. Particulate fission material was largely confined to the reactor building with slight radioactivity in the immediate vicinity of the building.
     A vessel called the Scorpion was the Soviet Union’s first nuclear-powered submarine. When a pipe in the control system of one of the reactors ruptured, radiation spread through the sea craft, killing the captain and seven crew members. Folksinger Phil Ochs wrote and recorded a moving song about the event.
     When a sodium cooling system failed on October 5, 1966, the core of a reactor partially melted. The Fermi 1 was a breeder reactor located at Lagoona Beach, thirty miles from Detroit. High temperatures were measured (700 degrees Fahrenheit compared to the normal 580) and radiation alarms sounded involving two fuel rod subassemblies. The reactor scrammed (shut down) and fuel melted. Had the super-heated fuel hit the water table, a hydrogen bubble would have arisen and spread toxic gas over much of southern Michigan and northern Ohio. As singer Gil Scott-Heron put it, “We almost lost Detroit.”
     After a month of squirming, the officials tested out enough subassemblies to limit the damage to six of them. By January 1967 they learned that four subassemblies were damaged with two stuck together, but it took four more months to remove them. When they had checked the sodium flow earlier, they had detected a clapping noise. In August 1967, technicians were able to lower a periscope device into the meltdown pan and found that a piece of zirconium cladding had come loose and was blocking the sodium coolant nozzles. The zirconium cladding was part of the lining of the meltdown cone designed to direct the distribution of fuel material should a meltdown of the fuel occur. Such structures are necessary in a breeder reactor because of the possibility or, in fact, likelihood of molten fuel reassembling itself in a critical configuration. This is not a possibility in an ordinary light water reactor because of the low level of enrichment of the uranium, but a fast breeder reactor is operated with a much higher level of enrichment. The phrase “China syndrome” was coined in regard to this accident as government and industry were contemplating the possibilities should a meltdown of fuel with critical reassembly take place. The uncontrolled fission reaction could create enough heat to melt its way into the earth, and some engineer reportedly quipped, “It could go all the way to China.”
     With tools designed and built for the purpose, the piece of zirconium was fished out in April of 1968. In May 1970, the reactor was ready to resume operation, but a sodium explosion delayed it until July of that year. In October it finally reached a level of 200 Megawatts. The total cost of the repair was $132 million. In August, 1972, upon denial of the extension of its operating license by the Atomic Energy Commission, the plant was permanently shut down.
     Before the end of the 1960s, a reactor coolant malfunctioned at the plant in Lucens, in the Swiss canton of Vaud, and an undisclosed amount of radiation escaped into a cave, one which was quickly sealed. Switzerland was by this time establishing itself as a major developer of nuclear energy, with its first commercial plants at Beznau and Mühleberg. But disaster struck the reactor as disasters often do. This one hit on January 21, 1969. A pressure tube burst, creating a power surge, and the reactor malfunctioned. Some radioactive gas escaped from the cavern and the reactor had to be shut down.
     The Swiss government ordered an inquiry into the incident and a report was published right away, if by “right away” you mean ten years later. The result? The inquiry by the Swiss Association for Atomic Energy found there had been no major negligence on the part of the plant’s managers. It blamed the blast on a pressure tube that had been corroded by humidity.
     On December 7, 1975, the reactor core at the Lubmin plant in what was then East Germany came very close to melting down as the result of a fire caused by an electrical short circuit.
     But it was on March 28, 1979 that there occurred an accident that everyone of a certain age will always remember. The nuclear power plant accident happened at Three Mile Island, located near Harrisburg, Pennsylvania. What happened in the early morning hours of that day was the worst reported nuclear plant crisis in United States history so far. Due to equipment failure and operator error, Three Mile Island experienced a partial core meltdown in its Unit 2 reactor. A meltdown occurs when the reactor core burns through its containment and begins a descent into the earth. Once it reaches the water table, two things will happen. First, an enormous hydrogen bubble will rise up through the freshly bored path and into the atmosphere, a concern of no small magnitude since hydrogen is extremely flammable. Second, the reactor hitting the water table will create a radioactive geyser 10,000 times the size of Old Faithful in Yellowstone National Park.
     Twenty-four hours later, no one had been able to cool the reactor. The meltdown continued. Forty-eight hours later, the plant owner, Metropolitan Edison, made the decision to vent radioactive materials into the atmosphere to reduce the pressure on the reactor and hence cool the core to prevent the China Syndrome from happening. The Nuclear Regulatory Commission, an agency as pro-nuclear power as they come, wrote up the situation this way: “By the evening of March 28, the core appeared to be adequately cooled and the reactor appeared to be stable. But new concerns arose by the morning of Friday, March 30. A significant release of radiation from the plant’s auxiliary building, performed to relieve pressure on the primary system and avoid curtailing the flow of coolant to the core, caused a great deal of confusion and consternation.” By the words “confusion” and “consternation” the NRC meant that people thought they were going to die. Even Pennsylvania’s governor advised all children and pregnant women within a five-mile radius of the plant to evacuate. Over that weekend, 140,000 central Pennsylvania residents evacuated. By the first of the week, President Jimmy Carter, himself a nuclear engineer, toured the facility in glowing yellow boots in an effort to convince people that it was safe to return.
     According to the NRC’s own report: “It was later found that about one-half of the core melted during the early stages of the accident. Although the TMI-2 plant suffered a severe core meltdown, the most dangerous kind of nuclear power accident, it did not produce the worst-case consequences that reactor experts had long feared. In a worst-case accident, the melting of nuclear fuel would lead to a breach of the walls of the containment building and release massive quantities of radiation to the environment.”
     “Breach” is one of those words government agencies love to use because this is a word that does not convey much of a mental image, which is why we heard it used so often during the New Orleans flooding after Hurricane Katrina. In the case of Three Mile Island, the word “breach” meant that the walls would rattle apart and collapse, allowing for an ungodly amount of radioactive gases to spray up into the air and contaminate the surrounding planet.
     And yet the disaster was not quite finished. The operation to rid the Three Mile Island Nuclear Power Plant of radioactive material was not completed until 1988, almost ten years after the installation was crippled by the worst commercial nuclear accident in United States history. The clean-up operation was suspended while the Nuclear Regulatory Commission and other government organizations prepared environmental impact “studies.” The NRC said it did not regard Three Mile Island as a safe waste disposal site. “Removing the damaged fuel and radioactive waste to suitable storage sites is the only reliable means of eliminating the risk of widespread contamination,” the Commission report said.
     Also in the United States, on February 11, 1981, 100,000 gallons of radioactive coolant leaked into the containment building of the Tennessee Valley Authority’s Sequoyah plant, contaminating eight workers. The accident occurred while the plant was shut down for maintenance. The plant used slightly contaminated water for emergency coolant because clean water would raise costs “needlessly.”
     Half a world away, in Bhopal, India, a Union Carbide plant leaked a toxic gas known as methyl isocyanate on December 3, 1984, killing 2,000 people and contaminating 150,000 others. The Indian government estimated that some 50,000 people were treated in the first days after this horrible accident. They were suffering from terrible side effects, including blindness, kidney failure, and liver disease. Since then, researchers have said that nearly 20,000 others have died from the effects of the leak. Investigations into the disaster revealed that something had malfunctioned with a tank that stored the lethal methyl isocyanate. In 1989, Union Carbide (which later became a subsidiary of Dow Chemical) paid the Indian government $470 million in a settlement which many described as woefully inadequate. But in 1999 a volunteer group in Bhopal, believing that not enough had been done to help victims, filed a lawsuit in the United States asserting that Union Carbide had violated international law and human rights. Three years later the government of India announced it was seeking the extradition from the United States of former Union Carbide CEO Warren Anderson, who faced charges of “culpable homicide” over cost-cutting at the plant alleged to have compromised safety standards. In October 2004, the Indian Supreme Court approved a compensation plan drawn up by the State Welfare Commission to pay nearly $350 million to more than 570,000 victims of the disaster.

     The world’s largest reported nuclear accident occurred a little after 1:00 a.m. on April 26, 1986 at the Chernobyl facility in the Ukraine. Two plant workers died that first day. An area within roughly a thirty-kilometer radius of the plant remains permanently quarantined. Russian scientists estimated that all of the xenon gas, about half of the iodine and cesium, and at least five percent of the remaining radioactive material in the Chernobyl 4 reactor core (which had 192 tons of fuel) were released in the accident. Most of the released material was deposited close by as dust and debris, but the lighter material was carried by wind over the Ukraine, Belarus, Russia and to some extent over Scandinavia and the rest of Europe.

     The casualties included firefighters who attended the initial fires on the roof of the turbine building. All of these fires were put out within a few hours, but radiation doses on the first day were estimated to range up to 20,000 millisieverts, causing twenty-eight more deaths by the end of July 1986. By some estimates, as many as half a million people eventually died as a result of this nuclear accident, though official tallies are far lower. Since the meltdown occurred, a nineteen-mile radius around the plant has been closed off to the public. But sometime in 2011 the area is scheduled to become a tourist attraction, open to the public. Those visitors who survive will be rocketed off to Saturn to see if they can also exist without oxygen.
     The accidents, meanwhile, just kept on clicking.

  • September 18, 1987: At Goiania, Brazil, 244 people were contaminated with Cesium-137 from a cancer-therapy machine that had been sold as scrap. Cesium-137 is one of the radionuclides yielded by nuclear reactions. Four people died. 
  • March 24, 1992: Radioactive iodine escaped from the Sosnovy Bor reactor near St. Petersburg, Russia. Radioactive iodine can cause thyroid problems, and long-term or chronic exposure can cause nodules or cancer of the thyroid. 
  • November, 1992: France suffered its worst nuclear accident when three workers were contaminated after entering the particle accelerator in Forbach. Company executives went to prison for neglecting proper safety measures. 
  • December, 1995: The Monju prototype fast-breeder reactor in Japan leaked two tons of sodium from its cooling system. The basic problem with sodium-cooled reactors like the liquid metal fast breeder is inherent in the use of sodium as a coolant: sodium reacts chemically with both air and water, and will burn strongly with either, so sodium leaks are a significant issue for these reactors. The history of sodium-cooled reactors gives scant comfort to those who argue that they are safe. 
  • March, 1997: In the first of two accidents at Tokaimura, Japan, a fire and explosion contaminated thirty-five workers with radiation. 
  • September 30, 1999: A second accident at Tokaimura’s uranium processing plant exposed fifty-five workers to radiation and led the government to order 300,000 people to stay indoors. Two of the plant’s workers died. 
  • May, 2003: Cesium-137 settled on trees in Siberia, Alaska and northern Canada. During that summer, these trees were burned in routine forest fires. A monitoring device in the Canadian Arctic detected the cesium, which had been produced decades earlier during nuclear weapons testing. 
  • August 9, 2004: Non-radioactive steam leaked from a nuclear power plant in Mihama, Japan, killing four workers. 

Progress and Punishment

       Chemical technician and union activist Karen Silkwood was killed on November 13, 1974. She had left a union meeting at the Hub Café in Crescent, Oklahoma. Another attendee of that meeting later testified that Silkwood had a binder and a packet of documents at the café. These were not found at the crime scene. Silkwood got into her car and headed alone for Oklahoma City, about thirty miles away, to meet with New York Times reporter David Burnham and Steve Wodka, an official of her union’s national office. It has been postulated that Silkwood’s car was rammed from behind by another vehicle with the intent of causing an accident that would result in her death. Skid marks from Silkwood’s car were present on the road, prompting some to suggest that she was desperately trying to get back onto the road after being pushed from behind. 
     Silkwood had begun carrying notebooks to document a variety of safety violations at the plant. Her claim was that people were being contaminated by plutonium all the time. Indeed, there had been at least seventeen acknowledged incidents of exposure involving seventy-seven employees of Kerr-McGee in the then-recent past.


  • September 29, 2003: India announced a seven-year project to construct an advanced heavy water reactor, one designed to yield more uranium than it consumes.
  • July 31, 2005: The Palo Verde Unit 2 near Phoenix, Arizona, became, in terms of output capacity, the largest reactor site in the United States. 

The Future of the Earth

     To understand the future of Earth, it may be helpful to examine the use of energy in many of our planet’s countries. The use of electricity is of course a major concern. According to a fascinating series of documents known as the CIA World Factbook, the top five countries in terms of electric power consumption are the United States, China, the European Union, Russia, and Japan. These same five also lead the world in production of electricity, and in the same order.
     Oil consumption is an enormous concern as well, partly because of its contribution to global warming and partly because the supply is finite, which is to say nonrenewable, increasingly costly to extract, and far from limitless. The top five consuming countries, according to the same source, are the United States, the European Union, China, Japan, and India. In the U.S. alone, eighteen million barrels of oil are used every day. By contrast, the top five oil-producing countries, in order of most to least, are Russia, Saudi Arabia, the United States, Iran, and China.
     Solar power, which is renewable, is a very different picture. In 1997, the United States was responsible for forty percent of the world’s solar energy production. By 2007 that number had dropped to eight percent. By 2009, solar and wind incentives to energy corporations had been completely eliminated by the U.S. Government (thanks in large part to Bush-Cheney energy policies, both men having permanent connections to the energy industries) while Germany and Japan increased theirs dramatically. In the meantime, the World Coal Association boasted about the increased use of the smoky black lumps of carbon. The top five countries in coal consumption, in order, are China, the United States, India, Japan, and South Africa.
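     To put that eighteen-million-barrel figure on a human scale, here is a minimal back-of-the-envelope sketch in Python. The 42 gallons per barrel is the standard U.S. oil barrel; the population of roughly 310 million is an outside assumption for the period, not a figure from this article:

        # Rough per-capita arithmetic for the U.S. oil figure quoted above.
        BARRELS_PER_DAY = 18_000_000       # figure cited in the text
        GALLONS_PER_BARREL = 42            # standard U.S. oil barrel
        US_POPULATION = 310_000_000        # assumed, circa 2010

        gallons_per_day = BARRELS_PER_DAY * GALLONS_PER_BARREL
        per_person = gallons_per_day / US_POPULATION

        print(f"Total:      {gallons_per_day:,} gallons per day")
        print(f"Per person: {per_person:.1f} gallons per day")   # roughly 2.4

     That works out to roughly two and a half gallons of oil per person, every single day.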
     All these statistics suggest that many of the same countries are responsible for most of the polluting and nonrenewable energy the world uses, and some people have argued that the best solution is a return to overwhelming dependency on nuclear power. As the Economist reported in 2009:
"It is hard to know the true cost of a modern nuclear plant. Most Western reactors that are still running were built years ago (Britain’s newest, Sizewell B, is 14 years old). Two new reactors of the type Britain may choose are being constructed in Finland and France. Discouragingly, the Finnish reactor, originally priced at €3 billion (£2.1 billion at the time), is three years late and around €2 billion more expensive than expected. The French plant is also thought to be over budget, by around 20%."
     Of course, the economics of compound interest can make almost anything appear financially promising. The real issue, some people warn, is that no matter what safeguards are in place, the opportunity for a major accident and meltdown is always present. The other clear issue is the storage of toxic radioactive waste, which must be isolated for thousands of years. The same magazine points out that the economic issue can be addressed by carbon taxing: “Nuclear energy’s best hope lies in carbon pricing, which forces fossil-fuel plants to pay for the environmental cost of the carbon they generate.” On the issue of safety, however, the publication is silent.
     That is not the case with the Union of Concerned Scientists. In 2007 they wrote: “The United States has strong safety regulations on the books, but the Nuclear Regulatory Commission does not enforce them consistently. Current security standards are inadequate to defend nuclear plants against terrorist attacks. A major accident or successful attack could kill thousands of people and contaminate large regions for thousands of years.”
     Ah, but as the makers of detergents like to say, clean-ups are a breeze. Well, not really. Reprocessing of nuclear fuel seemed for about ten minutes to be the answer to waste disposal. However, it was only tried in one place and did not exactly result in a world-class success. West Valley, New York was the site of the first and only commercial reprocessing plant in the United States. After beginning operations in 1966 with a theoretical capacity to reprocess 300 metric tons of spent nuclear fuel per year, the facility reprocessed a total of 640 tons in six years, far below expectations, before shutting down in 1972. In that time, it transformed West Valley into a radioactive waste site, ultimately accumulating over 600,000 gallons of high-level waste in onsite storage tanks. After years of delay, legal disputes, waste treatment, and billions of dollars in federal expenditures, stabilization of the high-level waste under the West Valley Demonstration Project (WVDP) was completed in 2002, but all of it remains onsite. Cleanup of reprocessing activities at the site, including “low-level” waste removal and decontamination, is expected to take forty years and cost over $5 billion. But that is all in the past, right?
     Nope. For six years, workers processed nuclear waste at the plant outside Buffalo. In its short life, the West Valley plant polluted soil, air and water, and sickened employees. Four decades later, hundreds of cleanup workers are still at the site decontaminating buildings that will eventually be torn down. As of December 2010, workers are preparing to install a massive underground wall designed to stop the spread of a radioactive plume that threatens the region’s groundwater. The Department of Energy estimates the ultimate cost of the cleanup to be in excess of five billion dollars.
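     And to put “far below expectations” in numbers, here is a minimal sketch using only the throughput figures quoted above:

        # West Valley throughput versus its theoretical capacity, per the figures above.
        CAPACITY_TONS_PER_YEAR = 300    # stated reprocessing capacity
        ACTUAL_TONS = 640               # total reprocessed, 1966-1972
        YEARS_OPERATING = 6

        theoretical_total = CAPACITY_TONS_PER_YEAR * YEARS_OPERATING   # 1,800 tons
        utilization = ACTUAL_TONS / theoretical_total

        print(f"Six-year theoretical total: {theoretical_total} tons")
        print(f"Share of capacity achieved: {utilization:.0%}")         # about 36%

     In other words, the plant managed barely more than a third of what it was built to do.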
     The other major nuclear issue of our time concerns the ugly matter of nuclear weapons. Even though the so-called Cold War is presumed to be over, a lot of nuclear weapons are still hanging around, waiting to be used or dismantled. The United States leads the list of countries with both strategic and nonstrategic nuclear weapons, modestly claiming 10,500-12,000 such devices. Russia comes in second with 10,000. France hits third place with 464, followed by China at 410, Israel with 200, and India with approximately 60.
     We have seen the effects of nuclear bombs at the various tests throughout the world and of course at Hiroshima and Nagasaki. But that was years ago! What about today? What are the effects of nuclear weapons right now? The answer depends upon the size of the bomb, but in general, certain factors are consistent. The energy of a nuclear explosion is transferred to the surrounding medium in three distinct forms: blast, thermal radiation, and nuclear radiation. The distribution of energy among these three forms will depend on the yield of the weapon, the location of the burst, and the characteristics of the environment. But for a low-altitude atmospheric detonation of a moderate-sized weapon in the kiloton range, the energy is distributed roughly as follows (a quick worked example appears after the list):

  • 50% as blast; 
  • 35% as thermal radiation, made up of a wide range of the electromagnetic spectrum, including infrared, visible, and ultraviolet light and some soft x-rays emitted at the time of the explosion; and 
  • 15% as nuclear radiation, consisting of 5% initial ionizing radiation (chiefly neutrons and gamma rays emitted within the first minute after detonation) and 10% residual nuclear radiation, which is the hazard in fallout. 
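     As a quick worked example, the sketch below partitions the energy of a hypothetical 20-kiloton burst according to those percentages, using the standard equivalence of one kiloton of TNT to about 4.184 × 10^12 joules. The 20-kiloton yield is an illustrative assumption, not a figure from this article:

        # Partition a hypothetical 20-kiloton yield using the percentages above.
        JOULES_PER_KILOTON = 4.184e12    # standard TNT-equivalent conversion
        yield_kt = 20                    # illustrative assumption (kiloton range)

        total_joules = yield_kt * JOULES_PER_KILOTON
        shares = {
            "blast": 0.50,
            "thermal radiation": 0.35,
            "initial nuclear radiation": 0.05,
            "residual nuclear radiation (fallout)": 0.10,
        }
        for form, fraction in shares.items():
            print(f"{form:38s} {fraction * total_joules:.2e} joules")

     Even the smallest slice, the initial ionizing radiation, amounts to several trillion joules.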

     Because of the tremendous amounts of energy liberated per unit mass in a nuclear detonation, temperatures of several tens of millions of degrees centigrade develop in the immediate area of the detonation. This is in marked contrast to the few thousand degrees of a conventional explosion. At these very high temperatures the nonfissioned parts of the nuclear weapon are vaporized. Much of the energy is released not as kinetic energy but as large amounts of electromagnetic radiation. In an atmospheric detonation, this electromagnetic radiation, consisting chiefly of soft x-rays, is absorbed within a few meters of the point of detonation by the surrounding atmosphere, heating it to extremely high temperatures and forming a hot sphere of air and gaseous weapon residues, otherwise known as the fireball. Immediately upon formation, the fireball begins to grow rapidly and rises like a hot air balloon. Within a millisecond after detonation, the diameter of the fireball from a 1-megaton air burst is 150 meters. This increases to a maximum of 2,200 meters within ten seconds, at which time the fireball is also rising at a rate of 100 meters per second. The initial rapid expansion of the fireball severely compresses the surrounding atmosphere, producing a powerful blast wave.
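     Taking those figures at face value, a small sketch shows the average rate of growth they imply for the 1-megaton fireball between the one-millisecond and ten-second marks; the numbers are the ones just quoted, not an independent model:

        # Average fireball growth implied by the figures quoted above.
        d_initial_m, t_initial_s = 150, 0.001    # diameter at one millisecond
        d_max_m, t_max_s = 2200, 10              # maximum diameter at ten seconds

        radial_growth = (d_max_m - d_initial_m) / 2 / (t_max_s - t_initial_s)
        print(f"Average radial growth: {radial_growth:.0f} meters per second")

     That average works out to roughly 100 meters per second, comparable to the rise rate quoted above, though the quoted figures themselves show that the expansion is fastest in the first fraction of a second.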
     As it expands toward its maximum diameter, the fireball cools, and after about a minute its temperature has decreased to such an extent that it no longer emits significant amounts of thermal radiation. The combination of the upward movement and the cooling of the fireball gives rise to the formation of the characteristic mushroom-shaped cloud. As the fireball cools, the vaporized materials in it condense to form a cloud of solid particles. Following an air burst, condensed droplets of water give it a typical white cloudlike appearance. In the case of a surface burst, this cloud will also contain large quantities of dirt and other debris which are vaporized when the fireball touches the earth’s surface or are sucked up by the strong updrafts afterwards, giving the cloud a dirty brown appearance. The dirt and debris become contaminated with the radioisotopes generated by the explosion or activated by neutron radiation and fall to earth as fallout.

     But the Earth does not have to be a repository for an unending series of toxic waste barrels and fallout shelters protecting the few survivors from thousands of years of radioactive poisoning. Solar energy is a viable and attractive alternative.
     It was during the latter half of the 1950’s that solar power saw its first mainstream use. The first solar-water-heated office building was designed during this period by an architect named Frank Bridgers. A short time later, a small satellite in the US Vanguard program was powered by a solar cell of less than one watt.
     After such big strides in the 1950’s, one would assume that solar power really took off, but oil prices held back any more mainstream use of solar power. In the 1960’s oil was so cheap that it was more affordable for people to power their homes and offices with oil than with solar energy.
     Solar power saw a rebirth in the 1970’s with the OPEC oil embargo, which created a real opportunity to utilize solar power. The US Department of Energy financed the Federal Photovoltaic Utilization Program, which was responsible for the installation and testing of over 3,000 photovoltaic systems.
     The 1990’s brought an even more mainstream interest in solar power. The Gulf War once again made many take note of where we get our oil and left some worried about our dependence on foreign countries for our energy resources. Solar power was seen as a great alternative to oil and petroleum products. During the 1990’s over one million homes had some form of solar power installed.
     But the Bush administration in the United States did away with the overwhelming majority of industry incentives to further develop solar power. And so today solar energy is used in only two basic ways. The first is photovoltaic conversion, which most people call solar panels; these panels create electricity directly from the sun and can be used alone or in conjunction with other power sources. The second is thermal solar power, in which the sun heats fluids that then drive turbines or other types of machinery. A rough sense of the raw numbers involved is sketched below.
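     As a back-of-the-envelope check on the acre-level figure cited in the next paragraph, the sketch below assumes a peak surface irradiance of about 1,000 watts per square meter and a typical panel efficiency of fifteen percent; both values are outside assumptions, not figures from this article:

        # Rough sunlight-per-acre arithmetic under assumed peak conditions.
        IRRADIANCE_W_PER_M2 = 1000     # assumed peak surface irradiance
        SQ_METERS_PER_ACRE = 4047      # area of one acre
        WATTS_PER_HORSEPOWER = 746
        PV_EFFICIENCY = 0.15           # assumed typical panel efficiency

        raw_watts = IRRADIANCE_W_PER_M2 * SQ_METERS_PER_ACRE
        print(f"Raw sunlight per acre: {raw_watts / WATTS_PER_HORSEPOWER:,.0f} horsepower")
        print(f"As PV electricity:     {raw_watts * PV_EFFICIENCY / 1000:,.0f} kilowatts")

     Under full sun that comes to roughly five thousand horsepower of raw sunlight per acre, the same ballpark as the figure quoted below, of which a photovoltaic array would convert around 600 kilowatts into electricity.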
     While solar power is more commonly used today than at any other time in history, the fundamentals are about the same as they have always been. The power of the sun is used to heat liquids, just as it was used to heat living space in ancient times. Photovoltaic technology has been updated so that the panels are thinner and smaller, but the principle is basically the same. The appeal is simple: when the sun is overhead, an acre of land receives roughly four thousand horsepower at any given moment. The sun always has been, and always will be, a tremendous source of power, which leaves no question that as the technology improves, our ability to harness this power will only become greater and more widespread. That is, unless governments and industries conspire to prevent it, a sobering possibility, given the long history of just this type of collusion.