Sunday, December 2, 2012

Manhattan Project


The Manhattan Project was a research and development program by the United States with the United Kingdom and Canada that produced the first atomic bomb during World War II. From 1942 to 1946, the project was under the direction of Major General Leslie Groves of the US Army Corps of Engineers. The Army component of the project was designated the Manhattan District; "Manhattan" gradually superseded the official codename, "Development of Substitute Materials", for the entire project. Along the way, the Manhattan Project absorbed its earlier British counterpart, Tube Alloys.
The Manhattan Project began modestly in 1939, but grew to employ more than 130,000 people and cost nearly US$2 billion (roughly equivalent to $25.8 billion as of 2012[1]). Over 90% of the cost was for building factories and producing the fissionable materials, with less than 10% for development and production of the weapons. Research and production took place at more than 30 sites, some secret, across the United States, the United Kingdom and Canada. Two types of atomic bomb were developed during the war. A relatively simple gun-type fission weapon was made using uranium-235, an isotope that makes up only 0.7 percent of natural uranium. Since it is chemically identical to the main isotope, uranium-238, and has almost the same mass, it proved difficult to separate. Three methods were employed for uranium enrichment: electromagnetic, gaseous and thermal. Most of this work was performed at Oak Ridge, Tennessee.
In parallel with the work on uranium was an effort to produce plutonium. Reactors were constructed at Hanford, Washington, in which uranium was irradiated and transmuted into plutonium. The plutonium was then chemically separated from the uranium. The gun-type design proved impractical to use with plutonium so a more complex implosion-type weapon was developed in a concerted design and construction effort at the project's weapons research and design laboratory in Los Alamos, New Mexico. The first nuclear device ever detonated was an implosion-type bomb at the Trinity test, conducted at New Mexico's Alamogordo Bombing and Gunnery Range on 16 July 1945. Little Boy, a gun-type weapon, and the implosion-type Fat Man were used in the atomic bombings of Hiroshima and Nagasaki, respectively.
The Manhattan Project operated under a blanket of tight security, but Soviet atomic spies still penetrated the program. It was also charged with gathering intelligence on the German nuclear energy project. Through Operation Alsos, Manhattan Project personnel served in Europe, sometimes behind enemy lines, where they gathered nuclear materials and rounded up German scientists. In the immediate postwar years the Manhattan Project conducted weapons testing at Bikini Atoll as part of Operation Crossroads, developed new weapons, promoted the development of the network of national laboratories, supported medical research into radiology and laid the foundations for the nuclear navy. It maintained control over American atomic weapons research and production until the formation of the United States Atomic Energy Commission in January 1947.

Origins

In August 1939, prominent physicists Leó Szilárd and Eugene Wigner drafted the Einstein–Szilárd letter, which warned of the potential development of "extremely powerful bombs of a new type". It urged the United States to take steps to acquire stockpiles of uranium ore and accelerate the research of Enrico Fermi and others into nuclear chain reactions. They had it signed by Albert Einstein and delivered to President Franklin D. Roosevelt. Roosevelt called on Lyman Briggs of the National Bureau of Standards to head the Advisory Committee on Uranium to investigate the issues raised by the letter. Briggs held a meeting on 21 October 1939, which was attended by Szilárd, Wigner and Edward Teller. The committee reported back to Roosevelt in November that uranium "would provide a possible source of bombs with a destructiveness vastly greater than anything now known."[2]
Briggs proposed that the National Defense Research Committee (NDRC) spend $167,000 on research into uranium, particularly the uranium-235 isotope, and the recently discovered plutonium.[3] On 28 June 1941, Roosevelt signed Executive Order 8807, which created the Office of Scientific Research and Development (OSRD),[4] with Vannevar Bush as its director. The office was empowered to engage in large engineering projects in addition to research.[3] The NDRC Committee on Uranium became the S-1 Uranium Committee of the OSRD; the word "uranium" was soon dropped for security reasons.[5]
In Britain, Otto Frisch and Rudolf Peierls at the University of Birmingham had made a breakthrough investigating the critical mass of uranium-235 in June 1939.[6] Their calculations indicated that it was within an order of magnitude of 10 kilograms (22 lb), which was small enough to be carried by a bomber of the day.[7] Their March 1940 Frisch–Peierls memorandum initiated the British atomic bomb project and its Maud Committee,[8] which unanimously recommended pursuing the development of an atomic bomb.[7] One of its members, the Australian physicist Mark Oliphant, flew to the United States in late August 1941 and discovered that data provided by the Maud Committee had not reached key American physicists. Oliphant then set out to find out why the committee's findings were apparently being ignored. He met with the Uranium Committee, and visited Berkeley, California, where he spoke persuasively to Ernest O. Lawrence. Lawrence was sufficiently impressed to commence his own research into uranium. He in turn spoke to James B. Conant, Arthur Compton and George Pegram. Oliphant's mission was therefore a success; key American physicists were now aware of the potential power of an atomic bomb.[9][10]
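A rough sense of where a critical-mass estimate like Frisch and Peierls' comes from (an illustrative back-of-the-envelope sketch, not their actual derivation): a chain reaction sustains itself when a neutron is more likely to cause another fission than to leak out of the sphere, which ties the critical radius to the fission mean free path:

    \lambda = \frac{A}{\rho N_A \sigma_f}        (fission mean free path)
    R_c \sim \frac{\lambda}{\sqrt{\nu - 1}}      (crude diffusion estimate of the critical radius)
    M_c = \frac{4}{3}\pi R_c^3 \rho              (critical mass)

Here \rho is the density of the metal, A its atomic mass, N_A Avogadro's number, \sigma_f the fast-fission cross-section, and \nu the average number of neutrons released per fission (about 2.5 for uranium-235). Because M_c \propto \lambda^3 \propto 1/\sigma_f^3, even a modest error in the assumed cross-section moves the answer by an order of magnitude, which helps explain why estimates of the period ranged from under a kilogram to tens of kilograms.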
At a meeting between President Roosevelt, Vannevar Bush and Vice President Henry A. Wallace on 9 October 1941, the President approved the atomic program. To control it, he created a Top Policy Group consisting of himself—although he never attended a meeting—Wallace, Bush, Conant, Secretary of War Henry L. Stimson and the Chief of Staff of the Army, General George Marshall. Roosevelt chose the Army to run the project rather than the Navy, as the Army had the most experience with management of large-scale construction projects. He also agreed to coordinate the effort with that of the British, and on 11 October he sent a message to Prime Minister Winston Churchill, suggesting that they correspond on atomic matters.[11]

Feasibility

Proposals

The S-1 Committee held its first meeting on 18 December 1941 "pervaded by an atmosphere of enthusiasm and urgency"[12] in the wake of the attack on Pearl Harbor and the subsequent declaration of war by the United States on Japan and Germany. Work was proceeding on three different techniques for isotope separation to separate uranium-235 from uranium-238. Lawrence and his team at the University of California, Berkeley, investigated electromagnetic separation, while Eger Murphree and Jesse Wakefield Beams's team looked into gaseous diffusion at Columbia University, and Philip Abelson directed research into thermal diffusion at the Carnegie Institution of Washington and later the Naval Research Laboratory.[13] Murphree was also the head of an unsuccessful separation project using centrifuges.[14]
Meanwhile, there were two lines of research into nuclear reactor technology: Harold Urey continued his research into heavy water at Columbia, while Arthur Compton brought the scientists working under his supervision at Columbia University and Princeton University to the University of Chicago, where in early 1942 he organized the Metallurgical Laboratory to study plutonium and reactors using graphite as a neutron moderator.[15] Briggs, Compton, Lawrence, Murphree and Urey met on 23 May 1942 to finalize the S-1 Committee recommendations, which called for all five technologies to be pursued. This was approved by Bush, Conant and Brigadier General Wilhelm D. Styer, the chief of staff of Major General Brehon B. Somervell's Services of Supply, who had been designated the Army's representative on nuclear matters.[13] Bush and Conant then took the recommendation to the Top Policy Group with a budget proposal of $54 million for construction by the United States Army Corps of Engineers, $31 million for research and development by OSRD, and $5 million for contingencies in fiscal year 1943. The Top Policy Group in turn sent it to the President on 17 June 1942, and he approved it by writing "OK FDR" on the document.[13]

Bomb design concepts

[Diagram of the implosion-type design: fast explosive, slow explosive, uranium tamper, plutonium core and neutron initiator.]
Compton asked the theoretical physicist J. Robert Oppenheimer of the University of California, Berkeley, to take over research into fast neutron calculations—the key to calculations of critical mass and weapon detonation—from Gregory Breit, who had quit on 18 May 1942 because of concerns over lax operational security.[16] John H. Manley, a physicist at the Metallurgical Laboratory, was assigned to assist Oppenheimer by contacting and coordinating experimental physics groups scattered across the country.[17] Oppenheimer and Robert Serber of the University of Illinois examined the problems of neutron diffusion—how neutrons moved in a nuclear chain reaction—and hydrodynamics—how the explosion produced by a chain reaction might behave. To review this work and the general theory of fission reactions, Oppenheimer convened meetings at the University of Chicago in June and at the University of California, Berkeley, in July 1942 with theoretical physicists Hans Bethe, John Van Vleck, Edward Teller, Emil Konopinski, Robert Serber, Stan Frankel, and Eldred C. Nelson, the latter three former students of Oppenheimer, and experimental physicists Felix Bloch, Emilio Segrè, John Manley and Edwin McMillan. They tentatively confirmed that a fission bomb was theoretically possible.[18]
There were still many unknown factors. The properties of pure uranium-235 were relatively unknown, as were those of plutonium, an element that had only been discovered in February 1941 by Glenn Seaborg and his team. The scientists at the Berkeley conference envisioned creating plutonium in nuclear reactors where uranium-238 atoms absorbed neutrons that had been emitted from fissioning uranium-235 atoms. At this point no reactor had been built, and only tiny quantities of plutonium were available from cyclotrons.[19] Even by December 1943, only two milligrams had been produced.[20] There were many ways of arranging the fissile material into a critical mass. The simplest was shooting a "cylindrical plug" into a sphere of "active material" with a "tamper"—dense material that would focus neutrons inward and keep the reacting mass together to increase its efficiency.[21] They also explored designs involving spheroids, a primitive form of "implosion" suggested by Richard C. Tolman, and the possibility of autocatalytic methods, which would increase the efficiency of the bomb as it exploded.[22]
Considering the idea of the fission bomb theoretically settled—at least until more experimental data was available—the Berkeley conference then turned in a different direction. Edward Teller pushed for discussion of a more powerful bomb: the "super", now usually referred to as a "hydrogen bomb", which would use the explosive force of a detonating fission bomb to ignite a nuclear fusion reaction in deuterium and tritium.[23] Teller proposed scheme after scheme, but Bethe refused each one. The fusion idea was put aside to concentrate on producing fission bombs.[24] Teller also raised the speculative possibility that an atomic bomb might "ignite" the atmosphere because of a hypothetical fusion reaction of nitrogen nuclei.[note 1] Bethe calculated that it could not happen,[26] and a report co-authored by Teller showed that "no self-propagating chain of nuclear reactions is likely to be started."[27] In Serber's account, Oppenheimer mentioned it to Arthur Compton, who "didn't have enough sense to shut up about it. It somehow got into a document that went to Washington" and was "never laid to rest".[note 2]

Uranium

Ore

The key raw material for the project was uranium, which was used as fuel for the reactors, as feed that was transformed into plutonium, and, in its enriched form, in the atomic bomb itself. There were four known major deposits of uranium in 1940: in Colorado, in northern Canada, in Joachimstal in Czechoslovakia, and in the Belgian Congo.[106] All but Joachimstal were in Allied hands. A November 1942 survey determined that sufficient quantities of uranium were available to satisfy the project's requirements.[107] Nichols arranged with the State Department for export controls to be placed on uranium oxide and negotiated for the purchase of 1,200 long tons (1,200 t) of uranium ore from the Belgian Congo that was being stored in a warehouse on Staten Island. He negotiated with Eldorado Gold Mines for the purchase of ore from its mine in Port Hope, Ontario, and its shipment in 100-ton lots. The Canadian government subsequently bought up the company's stock until it acquired a controlling interest.[108]
The richest source of ore was the Shinkolobwe mine in the Belgian Congo, but it was flooded and closed. Nichols unsuccessfully attempted to negotiate its reopening with Edgar Sengier, the director of the company that owned the mine, Union Minière du Haut Katanga.[109] The matter was then taken up by the Combined Policy Committee. As 30 percent of Union Minière's stock was controlled by British interests, the British took the lead in negotiations. Sir John Anderson and Ambassador John Winant hammered out a deal with Sengier and the Belgian government in May 1944 for the mine to be reopened and 1,720 long tons (1,750 t) of ore to be supplied.[110] To avoid dependence on the British and Canadians for ore, Groves also arranged for the purchase of US Vanadium Corporation's stockpile in Uravan, Colorado. Uranium mining in Colorado yielded about 800 long tons (810 t) of ore.[111]
Mallinckrodt Incorporated in St. Louis, Missouri, took the raw ore and dissolved it in nitric acid to produce uranyl nitrate. Ether was then added in a liquid-liquid extraction process to separate the impurities from the uranyl nitrate. This was then heated to form uranium trioxide, which was reduced to highly pure uranium dioxide.[112] By July 1942, Mallinckrodt was producing a ton of highly pure oxide a day, but turning this into uranium metal initially proved more difficult for contractors Westinghouse and Metal Hydrides.[113] Production was too slow and quality was unacceptably low. A special branch of the Metallurgical Laboratory was established at Iowa State College in Ames, Iowa, under Frank Spedding to investigate alternatives, and its Ames process became available in 1943.[114]
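In outline, the refining chain described above, together with the metal-reduction step the Ames process supplied, can be sketched as (a simplified sketch; the industrial process involved additional purification stages):

    U ore + HNO3 → UO2(NO3)2                    (dissolution to uranyl nitrate)
    UO2(NO3)2 → (heat) → UO3 + nitrogen oxides  (denitration to uranium trioxide)
    UO3 + H2 → UO2 + H2O                        (reduction to uranium dioxide)
    UO2 + 4 HF → UF4 + 2 H2O                    (conversion to "green salt")
    UF4 + 2 Mg → U + 2 MgF2                     (Ames process: reduction to uranium metal)

The first three steps correspond to Mallinckrodt's purification line; the last two show the route from dioxide to metal that the Ames process made practical.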


Weapon design


In 1943, development efforts were directed to a gun-type fission weapon with plutonium called Thin Man. Initial research on the properties of plutonium was done using cyclotron-generated plutonium-239, which was extremely pure, but could only be created in very small amounts. Los Alamos received the first sample of plutonium from the Clinton X-10 reactor in April 1944 and within days Emilio Segrè discovered a problem: the reactor-bred plutonium had a higher concentration of plutonium-240, resulting in up to five times the spontaneous fission rate of cyclotron plutonium.[179] Seaborg had correctly predicted in March 1943 that some of the plutonium-239 would absorb a neutron and become plutonium-240.[180]
This made reactor plutonium unsuitable for use in a gun-type weapon. The plutonium-240 would start the chain reaction too quickly, causing a predetonation that would release enough energy to disperse the critical mass with a minimal amount of plutonium reacted (a fizzle). A faster gun was suggested but found to be impractical. The possibility of separating the isotopes was considered and rejected, as plutonium-240 is even harder to separate from plutonium-239 than uranium-235 from uranium-238.[181]
Work on an alternative method of bomb design, known as implosion, had begun earlier at the instigation of the physicist Seth Neddermeyer. Implosion used explosives to crush a subcritical sphere of fissile material into a smaller and denser form. When the fissile atoms are packed closer together, the rate of neutron capture increases and the mass becomes critical. The metal needs to travel only a very short distance, so the critical mass can be assembled in much less time than it would take with the gun method.[182] Neddermeyer's 1943 and early 1944 investigations into implosion showed promise, but also made it clear that the problem would be much more difficult from a theoretical and engineering perspective than the gun design.[183] In September 1943, John von Neumann, who had experience with the shaped charges used in armor-piercing shells, argued that not only would implosion reduce the danger of predetonation and fizzle, but it would make more efficient use of the fissionable material.[184] He proposed using a spherical configuration instead of the cylindrical one that Neddermeyer was working on.[185]
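The scale of the speed-up can be seen with simple arithmetic (illustrative round numbers, not design figures): assembly time is roughly distance divided by closing speed, t ≈ d / v. A gun-type weapon fires a plug some tens of centimeters at a few hundred meters per second, giving t ≈ 0.3 m / 300 m/s ≈ 1 millisecond; an imploding shell driven by high explosive travels about a centimeter at several kilometers per second, giving t ≈ 0.01 m / 5,000 m/s ≈ 2 microseconds, hundreds of times faster.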

By July 1944, Oppenheimer had concluded plutonium could not be used in a gun design, and opted for implosion. The accelerated effort on an implosion design, codenamed Fat Man, began in August 1944 when Oppenheimer implemented a sweeping reorganization of the Los Alamos laboratory to focus on implosion.[186] Two new groups were created at Los Alamos to develop the implosion weapon, X (for explosives) Division headed by George Kistiakowsky and G (for gadget) Division under Robert Bacher.[187][188] The new design that von Neumann and T (for theoretical) Division, most notably Rudolf Peierls, had devised used explosive lenses to focus the explosion onto a spherical shape using a combination of both slow and fast high explosives.[189]
The design of lenses that detonated with the proper shape and velocity turned out to be slow, difficult and frustrating.[189] Various explosives were tested before settling on composition B as the fast explosive and baratol as the slow explosive.[190] The final design resembled a soccer ball, with 20 hexagonal and 12 pentagonal lenses, each weighing about 80 pounds (36 kg). Getting the detonation just right required fast, reliable and safe electrical detonators, of which there were two for each lens for reliability.[191] It was therefore decided to use exploding-bridgewire detonators. A contract for their manufacture was given to Raytheon.[192] To study the behavior of converging shock waves, Serber devised the RaLa Experiment, which used the short-lived radioisotope lanthanum-140, a potent source of gamma radiation, in an ionization chamber.[193]
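Taking the figures above at face value (simple arithmetic on the stated numbers, counting only the outer lens layer): 20 + 12 = 32 lenses at about 80 pounds (36 kg) each comes to roughly 2,560 pounds (about 1,160 kg) of shaped high explosive.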
Within the explosives was the 4.5-inch (110 mm) thick aluminum pusher, which provided a smooth transition from the relatively low density explosive to the next layer, the 3-inch (76 mm) thick tamper of natural uranium. Its main job was to hold the critical mass together as long as possible. It would also reflect neutrons back into the core. Some part of it might fission as well. To prevent predetonation by an external neutron, the tamper was coated in a thin layer of boron.[191] A polonium-beryllium modulated neutron initiator, known as an "urchin" because its shape resembled a sea urchin,[194] was developed by the Monsanto Company to start the chain reaction at precisely the right moment.[195] This work with the chemistry and metallurgy of radioactive polonium was directed by Charles Allen Thomas and became known as the Dayton Project.[196] Testing required up to 500 curies per month of polonium, which Monsanto was able to deliver.[197] The whole assembly was encased in a duralumin bomb casing to protect it from bullets and flak.[191]

The ultimate task of the metallurgists was to determine how to cast plutonium into a sphere. The difficulties became apparent when attempts to measure the density of plutonium gave inconsistent results. At first contamination was believed to be the cause, but it was soon determined that there were multiple allotropes of plutonium.[198] The brittle α phase that exists at room temperature changes to the plastic β phase at higher temperatures. Attention then shifted to the even more malleable δ phase that normally exists in the 300 °C to 450 °C range. It was found that this was stable at room temperature when alloyed with aluminum, but aluminum emits neutrons when bombarded with alpha particles, which would exacerbate the pre-ignition problem. The metallurgists then hit upon a plutonium-gallium alloy, which stabilized the δ phase and could be hot pressed into the desired spherical shape. As plutonium was found to corrode readily, the sphere was coated with nickel.[199]
The work proved dangerous. By the end of the war, half the experienced chemists and metallurgists had to be removed from work with plutonium when unacceptably high levels of the element appeared in their urine.[200] A minor fire at Los Alamos in January 1945 led to a fear that a fire in the plutonium laboratory might contaminate the whole town, and Groves authorized the construction of a new facility for plutonium chemistry and metallurgy, which became known as the DP-site.[201] The hemispheres for the first plutonium pit (or core) were produced and delivered on 2 July 1945. Three more hemispheres followed on 23 July and were delivered three days later.

Read more: http://en.wikipedia.org/wiki/Manhattan_Project




Big Brother (Nineteen Eighty-Four)


Big Brother is a fictional character in George Orwell's novel Nineteen Eighty-Four. He is the enigmatic dictator of Oceania, a totalitarian state taken to its utmost logical consequence – where the ruling Party wields total power for its own sake over the inhabitants.
In the society that Orwell describes, everyone is under complete surveillance by the authorities, mainly by telescreens. The people are constantly reminded of this by the phrase "Big Brother is watching you", which is the core "truth" of the propaganda system in this state.
Since the publication of Nineteen Eighty-Four, the term "Big Brother" has entered the lexicon as a synonym for abuse of government power, particularly in respect to civil liberties, often specifically related to mass surveillance.

Purported origins


In the essay section of his novel 1985, Anthony Burgess states that Orwell got the idea for Big Brother from advertising billboards for educational correspondence courses from a company called Bennett's, current during World War II. The original posters showed Bennett himself: a kindly-looking old man offering guidance and support to would-be students, with the phrase "Let me be your father" attached. After Bennett's death his son took over the company, and the posters were replaced with pictures of the son (who looked imposing and stern in contrast to his father's kindly demeanour) with the text "Let me be your big brother."
As well as Bennett, speculation has also focused on Lord Kitchener,[1] who among other things was prominently involved in British military recruitment in World War I. As a child Orwell (under his real name Eric Blair) published poems praising Kitchener and war recruitment in his local newspaper.
Additional speculation from Douglas Kellner of UCLA argued that Big Brother represents Joseph Stalin and that the novel portrayed life under totalitarianism.[2]

Appearance in the novel


Existence

In the novel it is not clear whether Big Brother is (or was) a real person or a fiction invented by the Party to personify itself.
In Party propaganda Big Brother is presented as a real person: one of the founders of the Party, along with Goldstein. At one point in 1984 Winston Smith, the protagonist of Orwell's novel, tries "to remember in what year he had first heard mention of Big Brother. He thought it must have been at some time in the sixties, but it was impossible to be certain. In the Party histories, of course, Big Brother figured as the leader and guardian of the Revolution since its very earliest days. His exploits had been gradually pushed backwards in time until already they extended into the fabulous world of the forties and the thirties, when the capitalists in their strange cylindrical hats still rode through the streets of London..." In the year 1984 Big Brother appears on posters and the telescreen as a man of about 45. Goldstein's book comments: "We may be reasonably sure that he will never die, and there is already considerable uncertainty as to when he was born."
When Winston Smith is later arrested, O'Brien, his interrogator, again describes Big Brother as a figure who will never die. When Smith asks if Big Brother exists, O'Brien describes him as "the embodiment of the Party" and that he will exist as long as the Party exists. When Winston follows up his question by asking "Does Big Brother exist the same way I do?", O'Brien replies "You do not exist."

Cult of personality

A spontaneous ritual of devotion to Big Brother ("BB") is illustrated at the end of the "Two Minutes Hate":
At this moment the entire group of people broke into a deep, slow, rhythmic chant of 'B-B! .... B-B! .... B-B!'—over and over again, very slowly, with a long pause between the first 'B' and the second—a heavy murmurous sound, somehow curiously savage, in the background of which one seemed to hear the stamps of naked feet and the throbbing of tom-toms. For perhaps as much as thirty seconds they kept it up. It was a refrain that was often heard in moments of overwhelming emotion. Partly it was a sort of hymn to the wisdom and majesty of Big Brother, but still more it was an act of self-hypnosis, a deliberate drowning of consciousness by means of rhythmic noise.[3]
Though Oceania's Ministry of Truth, Ministry of Plenty, and Ministry of Peace each have names with meanings deliberately opposite to their real purpose, the Ministry of Love is perhaps the most straightforward: "rehabilitated thought criminals" leave the Ministry as loyal subjects who have been brainwashed into genuinely loving Big Brother.

Legacy

Since the publication of Nineteen Eighty-Four, the phrase "Big Brother" has come into common use to describe any prying or overly controlling authority figure, and attempts by government to increase surveillance.
Ukrainian-American comedian Yakov Smirnoff makes frequent reference to both Big Brother and other Orwellian traits in his Russian Reversal jokes.
The magazine Book ranked Big Brother No. 59 on its 100 Best Characters in Fiction Since 1900 list. Wizard magazine rated him the 75th greatest villain of all time.[4]
The worldwide reality television show Big Brother is based on the novel's concept of people being under constant surveillance. In 2000, after the U.S. version of the CBS program "Big Brother" premiered, the Estate of George Orwell sued CBS and its production company "Orwell Productions, Inc." in federal court in Chicago for copyright and trademark infringement. The case was Estate of Orwell v. CBS, 00-c-5034 (ND Ill). On the eve of trial, the case settled worldwide to the parties' "mutual satisfaction"; the amount that CBS paid to the Orwell Estate was not disclosed. CBS had not asked the Estate for permission. Under current laws the novel will remain under copyright protection until 2020 in the European Union and until 2044 in the United States.
The iconic image of Big Brother (played by David Graham) played a key role in Apple's 1984 television commercial introducing the Macintosh.[5][6] The Orwell Estate viewed the Apple commercial as a copyright infringement, and sent a cease-and-desist letter to Apple and its advertising agency. The commercial was never televised again.[7]

The December 2002 issue of Gear magazine featured a story about technologies and trends that could violate personal privacy, moving society closer to a "Big Brother" state, and utilised a recreation of the movie poster from the film version of 1984 created by Dallmeierart.com.[8]
In 2011, media analyst and political activist Mark Dice published a non-fiction book titled Big Brother: The Orwellian Nightmare Come True which analyses the parallels between elements of the storyline in Nineteen Eighty-Four, and current government programs, technology, and cultural trends.[9]
In 2011, Microsoft patented a product distribution system with a camera or capture device that monitors the viewers who consume the product, allowing the provider to take "remedial action" if the actual viewers do not match the distribution license.[10] The system has been compared with 1984's telescreen surveillance system.[11]
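The logic the patent describes amounts to a simple comparison; a minimal sketch in Python (hypothetical names and return values, since the patent publishes an idea, not code):

    def enforce_license(detected_viewers: int, licensed_viewers: int) -> str:
        # Compare the head count reported by the capture device
        # against the number of viewers the content license permits.
        if detected_viewers > licensed_viewers:
            # "Remedial action" in the patent's language, e.g. pausing
            # playback or prompting for an upgraded license.
            return "remedial action"
        return "playback allowed"

    # Example: a license sold for two viewers, but the camera reports four.
    print(enforce_license(detected_viewers=4, licensed_viewers=2))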


Saturday, December 1, 2012

Frankenfish and the World of Genetically Modified Food



The scary ways our food supply is being screwed with and why you might not know you're eating altered fish


By Gretchen Voss, Photography By Dan Forbes
 
 
 
Really, who among us has looked at a salmon glistening in the grocery store and wondered, Where did you come from?

If you've thought about it at all, you've probably assumed it's a matter of the birds and the bees: Girl fish meets boy fish, they hit it off, and he showers her thousands of eggs with his sperm. A little while later their offspring are hatched, grow to 10 pounds in about three years, and land on our grills.

Now, however, a biotechnology company near Boston wants to change all that. AquaBounty Technologies (ABT), using some snazzy feats of engineering and genetic code manipulations, has developed a fish—called AquAdvantage Salmon—that grows twice as fast as conventional ones. Pending FDA approval, ABT could be the first company in the world to market a genetically engineered animal as food.

In other words, the birds and the bees are about to get a turbo boost. On Prince Edward Island, in a secure facility surrounded by an eight-foot-high chain-link fence and monitored by video cameras and guards, ABT has concocted a "product" (salmon eggs) that it calls a triploid hemizygous all-female Atlantic salmon. Though the president of ABT, Ronald L. Stotish, describes the decidedly unsexy reproductive process as "a very simple thing," it is anything but. Because the growth hormones of regular Atlantic salmon are turned on only about three months out of the year, ABT created a genetic cocktail that makes the fish grow continuously. The company's scientists did this via a new gene construct that combines the growth hormones of a Chinook salmon with a regulator gene from an eel-like fish called an ocean pout. Then they injected the new gene into Atlantic-salmon eggs. The resulting fish reach market size in 18 months, half the usual time—so they can land on our grills faster.

The company claims that the new salmon, with its "shorter production cycles and increased efficiency of production," is one biotechnology answer to a global food problem. That it will provide more food for a hungry world by advancing the already $86 billion farmed-fish business, the fastest-growing segment of the seafood industry worldwide. And that it's perfectly safe for humans to consume. At least, ABT claims, there's no proof that screwing with an animal's genome makes it unsafe to eat.

So far, the FDA agrees. After reviewing and accepting the studies submitted by ABT this past fall, the agency is now considering whether to approve its application. If it does, these new fish could make their way onto our plates in the next few years.

But not without swimming in a sea of Frankenfish controversy. Critics are incensed that the FDA is regulating this transgenic animal food as an animal drug—since the DNA construct inserted into salmon eggs fits the definition of a drug—which means it's not undergoing the standard evaluations that foods typically do. They are concerned about dangerous, unanticipated health problems for humans who eat this food, and worried that an approval will pave the way for the other genetically engineered animals currently in the research-and-development pipeline. And what has them really burning: Consumers won't have the freedom to accept or reject this altered fish, since it most likely won't be labeled as such.

It may seem like a scary new Orwellian world, but futuristic biotechnology has already been replacing traditional agriculture, without the consent of consumers. Until now, however, that technology was confined to the genes of plants we eat. Not animals.

Tampering with Nature
It was in the 1970s that scientists discovered they could transfer genes from the DNA of one species into that of a wholly different one—that they could, in essence, monkey with Mother Nature and bust through the natural barriers created by millions of years of evolution. In the wake of this breakthrough, genetic engineers began creating artificial gene combinations by splicing genetic material from bacteria, viruses, and other organisms and forcibly injecting them into plant genomes to create novel traits.

At the time, the big hope of agricultural biotechnology was to create more nutritious food and more productive crops to feed a hungry world. In reality, what the companies developed were either crops with built-in pesticides (for example, Bt corn, which has genes from a soil bacterium inserted into its DNA so that every cell of the plant produces pesticidal toxins) or crops with herbicide tolerance (such as Roundup Ready soybeans, which are engineered to withstand Roundup weed killer). In other words, the focus wasn't on cultivating food that was better for us, but on increasing profits for these companies.

By the 1990s, companies such as Monsanto, the developer of Roundup Ready seeds, were positioned to bring their products to market, and the U.S. government needed to devise a game plan to oversee these newfangled food crops. "The government decided that biotechnology was the technology of the 21st century and the U.S. had to lead the world," says Bill Freese, a science policy analyst with the Center for Food Safety, a nonprofit for public interest and environmental advocacy. "As part of that commitment, they decided on a very lax regulatory system."

That's because the FDA's 1992 policy on genetically modified (GM) foods was based on this premise: The foods derived from these new methods weren't different from other foods in any meaningful way—or at least, the FDA said, it wasn't aware of any information that indicated they were different. And so no safety studies were necessary, no labels were required, and ultimately, the food producers were responsible for policing themselves.

The regulatory process, says Freese, is a glorified rubber stamp. "It's totally unlike the drug-approval process, in which the FDA actually takes responsibility," he says.

"This is a radical new technology. We need very good, careful, close regulation, and we just don't have that. We can't be assured of the safety of any of these genetically engineered organisms."

In fact, contrary to popular belief, the FDA doesn't have the authority to require that companies seek an approval for GM crops. It provides only voluntary consultations. "So GM varieties that have never been fed to animals in rigorous safety studies, let alone to humans, are approved for sale in grocery stores," says Jeffrey M. Smith, author of Seeds of Deception.

And we have been chowing down on mass quantities of the stuff since the mid 1990s, without any labels to give us pause.

Indeed, as much as 80 percent of the processed foods filling the supermarket aisles and overflowing our shopping carts—including infant formula, salad dressing, bread, cereal, crackers, cookies, veggie burgers, frozen yogurt, protein powder, and alcoholic beverages—contain an engineered ingredient, mostly from soy, canola, and corn.

Playing Genetic Roulette
But now the concern that these genetically manipulated foodstuffs are harming human health is growing. Inserting a gene into a plant's genome is a random and haphazard process that allows no control over where the gene actually ends up in the plant's otherwise carefully constructed DNA. Insertions can show up inside other genes, can delete natural genes or permanently turn them on or off, and can cause significant mutations near the insertion site. For instance, one study found that a gene known to be a corn allergen was turned on in GM corn, though it was turned off in its conventional parent.

"It's genetic roulette," says Smith. "You can create carcinogens, anti-nutrients, toxins. We don't understand the language of DNA enough to predict what might happen. It's an infant technology, and we're making changes that are permanent in the gene pool of species."

The potential for allergenicity is an acknowledged problem, says Michael Hansen, Ph.D., a senior scientist at Consumers Union, publisher of Consumer Reports. In fact, food allergies doubled from 1997 (the year Bt corn entered our diets) to 2002. The EPA recently gave the University of Chicago a grant to assess whether the pesticides produced in genetically modified plants might be the culprit.

Agricultural biotechnology companies roundly dismiss these fears, saying that millions of Americans have been eating GM foods without ill effect. So if they are making us sick, why don't we know about it?

"The trouble is we don't have controls. We don't know what we're eating because we don't have labeling, so it's all just sort of a crapshoot," says Freese. "The GM corn is mixed in with regular corn, and then it enters the food supply. It's really impossible to trace the impacts." Since neither consumer nor manufacturer actually knows how much GM content is in their food, it's near impossible to investigate properly and expose a link to illness.

If the lab-created foods currently on the market are causing common diseases, we may not be able to identify the source of the problem for decades, if ever. Still, in May 2009, the American Academy of Environmental Medicine, an international association of physicians interested in environmental impacts on human health, called for an immediate moratorium on GM foods. "Several animal studies indicate serious health risks associated with GM food consumption," it stated in a position paper, citing infertility, immune dysregulation, accelerated aging, faulty insulin regulation, and changes in the liver, kidney, and gastrointestinal systems. The group advised physicians to ask their patients to try to avoid GM foods, as "there is more than a casual association between GM foods and adverse health effects."

And yet, just this year, the USDA approved genetically engineered sugar beets (used to make sugar) and alfalfa. Next up: the first animal genetically engineered for food, in the form of one fast-growing salmon.
 
http://www.womenshealthmag.com/health/frankenfish