Artificial life

Predictions revisited

Forecasts and fears about artificial life

Today’s news of Craig Venter’s success in fashioning a new species of bacterium, with completely man-made DNA, propels me to the bookshelf to recall how this breathtaking event was anticipated with some trepidation.

“Fried-egg” colonies of (a) the synthetic bacterium Mycoplasma mycoides JCVI-syn1.0 and (b) the natural Mycoplasma capricolum bacterium that provided the cytoplasm but had its DNA replaced. The blue colour in (a) is created by a protein coded in the new DNA that reacts with a chemical in the agar base. Without that gene, the colonies of the unmodified recipient bacterium remain white (b). From Gibson et al., Science; see references.

In brief, as reported online in Science magazine, Venter’s team designed, synthesized and assembled a DNA chromosome. They transplanted it into a bacterium to create novel cells controlled only by the synthetic DNA, and capable of continuous self-replication.

Anticipations of manipulated life

My personal favourite among the projections of genetic engineering has long been that of the physicist Freeman J. Dyson, with his idea of breeding gigantic trees capable of living on comets, to provide human habitation even in the outer Solar System.

But if you google “genetic engineering” you’ll get 5.4 million offerings, or 2.2 million for “bioethics”. Old books on my shelf have many pages on the future manipulation of life and its possible risks. To be reasonably brief, I’ve picked a possibly useful overview of issues anticipated a quarter of a century ago. It’s from The Green Machines, which was written as if looking back from 2030 and published in 1986.

In Aztec times in Mexico, people made biscuits out of mats of cyanobacteria called Spirulina, which they harvested from alkaline lakes. By the 1970s Spirulina was known to be exceptionally rich in protein, and harvesting resumed in Mexico. Italian microbiologists discovered that they could prolong the growing season for Spirulina by cultivating it in polyethylene tubes, and they reported production rates of 50 tons per hectare per year, roughly ten times the yields of ordinary crops. Doing without sunshine, Imperial Chemical Industries grew bacteria (Methylophilus methylotrophus) on methanol and marketed them as Pruteen.


Few environments were so barren or harmful to life that microorganisms were entirely absent. Bacteria accustomed to living at high temperatures in hot springs were well suited to the production of biogas from animal manure and human sewage in biogas plants running at up to 60°C. Other bacteria thrived among heavy metals that would kill most organisms. They drew sustenance from a range of energy-releasing chemical reactions that seemed strange to animals with more limited tastes. Thiobacillus bacteria, for example, could grow by stealing electrons from metal atoms and using them to buy carbon from carbon dioxide. Such microbes were used commercially for winning metals from low-grade copper ores, and experimentally for concentrating uranium from sea water.

Bacteria were always the planet’s cleansing agents. Under human management they played a routine part in purifying fresh water contaminated by sewage and industrial wastes. Botho Boehke of Aachen found that purification plants could be made far more effective by excluding large microorganisms that preyed on the bacteria. Bacteria in the sea and the beaches destroyed oil spills, whether from natural leaks or wrecked supertankers. In their cleansing role, bacteria attracted the attention of genetic engineers, with Ananda Chakrabarty of General Electric in the U.S. leading the way.

In the early 1970s, Chakrabarty gathered genetic elements (plasmids) from four different bacteria and combined them in one oil-eating “superbug.” This created a stir because in 1980 it became the first genetically engineered microbe to be patented. In 1981, Chakrabarty created another bacterium that broke down the persistent herbicide 2,4,5-T, which was used as an environmental weapon in the Vietnam War and was responsible for serious pollution in the U.S. and Italy. By 1985, Ken Timmis and his colleagues at the University of Geneva had engineered a bacterium to break down another persistent chemical pollutant, methylchlorophenol.

The chemical repertoire of natural microorganisms, already enormous, could be enlarged almost indefinitely by genetic engineering. Bacteria could, for example, replace plants and animals as the source of valuable chemicals. The first commercial process of that kind made rennin in a bacterium. Rennin was an enzyme found naturally in the fourth stomach of a calf, and cheese makers needed it for curdling milk. Genetic engineers of Celltech in England transferred the gene coding for rennin production into the bacterium Escherichia coli, which thereupon manufactured rennin abundantly. After treatment with acid (as in the calf’s stomach) the engineered rennin had the same effect on milk as the natural enzyme.

“Protein engineering” was the tag under which molecular biologists declared their hopes of going beyond the transfer of existing genes from one organism to another, to the creation of novel genes for making useful enzymes and other proteins. This idea was being pushed hard by a protein-engineering “club” of university and industrial scientists in Britain in the mid-1980s. The chief impediment was ignorance of the natural rules governing the shapes and behavior of protein molecules; on the other hand, devising novel proteins was a good way to learn. Then the way would be clear to improve the efficiency of natural enzymes and to develop completely new enzymes to carry out tasks prescribed by the chemists, for human purposes.

Genetic engineering could also mitigate harmful effects of bacteria. A case in point was Pseudomonas syringae, a bacterium that occurred on the leaves of plants. Steve Lindow of the University of California at Berkeley found that it caused frost damage to the leaves because the bacterium carried a protein on its jacket that made an ideal nucleus for the formation of ice crystals. Lindow and Nicholas Panopulos found the gene responsible for this ice-nucleating propensity and deleted it by genetic engineering. Thus they produced a strain of P. syringae that did not provoke ice crystals to form; they called it “ice-minus.” Lindow’s plan was to spray it on fields and orchards in the hope of replacing the harmful bacterium with its man-made “ice-minus” cousin, thus protecting the crops. But he was to be thwarted when an environmentalist came on the scene with a restraining order from a federal court.

The New Debate About Safety

Jeremy Rifkin, author and activist, won court rulings in 1983 and 1984 that prevented Steve Lindow from testing his “ice-minus” bacteria in the open air. For more than two years, as the law took its course, genetic engineering for agricultural purposes remained suspended on the brink of its first field trials. Rifkin’s name was reviled in some quarters. Academic scientists resented the doubts cast on their own judgment; agrochemical businessmen investing heavily in genetic engineering complained about the threat to their profits. But the more loudly the new biologists proclaimed their ability to change the world, the greater was the onus on them to convince their fellow inhabitants of the planet that they would not wreck it.

Genetic engineering made many people deeply uneasy. They had moral scruples about the sanctity of natural life, objected to scientists playing God, worried about the likelihood that humans would become subjects for genetic engineering, and expressed practical anxieties about dangerous man-made organisms breaking loose. There was no doubt about the power that provoked these fears. An advertisement from a Swedish company appearing in scientific journals in 1985 said it all:

At last! An easy-to-use gene machine at an affordable price. Now any laboratory working with oligonucleotides can perform automated DNA synthesis simply, reliably and affordably, whatever the volume demand, with the new Pharmacia Gene Assembler™.

A decade earlier, in the great debate about recombinant DNA, the scientists had allayed public fears by sealing in potentially dangerous experiments. Now they were proposing to go to the opposite extreme, by scattering genetically engineered organisms across the landscapes of the world. During the moratorium that Rifkin’s legal battle imposed on agricultural biotechnology, many of the sharpest questions came from scientists of repute.

The concern was not confined to the U.S. The release of genetically engineered organisms into the environment was prohibited in a number of countries, including Switzerland where Ken Timmis was developing his antipollution bacteria. The Organization for Economic Cooperation and Development in Paris, which linked the rich industrial nations of North America, Europe, and Japan, labored to try to produce guidelines for safety and regulation in biotechnology. The Reagan administration thought its European partners were taking too negative a view of the new opportunities in biotechnology.

The opinion spectrum extended from those who detested the very idea of genetic engineering, to industrialists who wanted no regulations at all. Molecular biologists and professional ecologists argued the issues at scientific meetings in Philadelphia and Helsinki during 1985. In these discussions it became plain that grounds for great caution existed, but probably not for a total ban on the release of engineered organisms.

Imaginative horror stories helped to focus the mind. Suppose a new nitrogen-fixing bacterium were so successful that it covered the oceans with a deadly scum. Or visualize a genetically engineered wood-eating bug equipped with the potent ligninase enzyme escaping from a factory and destroying all the world’s forests. Or consider, as Rifkin himself argued, that Lindow’s “ice-minus” bacterium might so spread and multiply as to interfere with the natural processes of ice formation in the atmosphere, altering the world’s climate.

Although gung-ho genetic engineers dismissed them as science fiction, such scenarios made salutary points. First, a very small chance of a very great catastrophe ought to be treated as a non-negligible risk. Again, organisms were no respecters of national frontiers, and misguided operations in one country could bring disaster to many others. And in the narratives of the horror stories the perpetrator was often an irresponsible scientist who did not tell his colleagues what he was doing.

Much depended on candor and openness of the kind that Lindow himself displayed, but that in turn relied on the existence of free societies. Even there, commercial secrecy could conceal the uses being made of engineered organisms. Differences in policies between the nations showed that the anarchical organization of the world into nation-states offered scant protection. If a truly mad genetic engineer were bent on a truly dangerous experiment he could find a haven somewhere on the planet.

Wise biologists remembered disasters from pre-DNA days: the European rabbits breaking loose in Australia; the water hyacinth from South America, introduced into Africa as an ornamental plant, spreading to clog the rivers and lakes of that continent; the African killer bees that escaped from a genetics lab in South America and, again, took over a continent. These events were not science fiction, even though they sounded like it. Nevertheless, genetically engineered plants and animals would at least be visible and perhaps killable, if they threatened to get out of hand.

That could not be said for genetically engineered microbes released into the environment. In some forms, microbes could not even be detected by culturing in the laboratory. And their capacity to spread was notorious. A scientific critic of the proposed field trials of the “ice-minus” bacterium, David Pimentel of Cornell University, pointed out that insects picked up the normal Pseudomonas syringae bacteria from the plants where they lived. If, as a result of the experiments, harmful insects became less vulnerable to freezing, or beneficial insects such as honeybees were made more vulnerable, the consequences could be grave.

Bacteria were also far more likely than plants or animals to engage in natural genetic engineering on their own account. The transfers of genes that conferred resistance to antibiotics to many species of bacteria were a case in point. Ken Timmis even advertised his hope that, if he introduced gene-spliced bacteria for curing methylchlorophenol pollution, they would pass on their talent to other soil bacteria. Robert Goodman of Calgene was one of many who argued that the only difference between what molecular biologists were doing in the lab and what happened in nature was that bacteria were better at it than the scientists. The fact remained that gene swapping between bacteria meant that the ultimate destinations of engineered genes were unpredictable.

A consensus that emerged between moderate gene-splicers and moderate ecologists was that a total ban on the release of genetically engineered organisms was unreasonable in view of the benefits they might bring; on the other hand, great prudence was needed and ought to be enforced with some measures of regulation similar to those governing the introduction to the marketplace of new foodstuffs and pharmaceuticals. Delays were necessary for thought, for scientific debate, and for appropriate laboratory experiments and field studies. Ecologists were, in the end, reassured about the “ice-minus” bacteria when mutants of Pseudomonas syringae turned up in the wild. These also lacked the capacity to nucleate ice crystals and seemed to be doing no harm. In a backhanded fashion, Nature approved the experiments by showing them to be unoriginal.

Rifkin struck again in 1986, in a lawsuit to try to block the sale of a genetically engineered live-virus vaccine against pseudorabies in swine. The U.S. Department of Agriculture had violated the rules in not notifying its own Agricultural Recombinant DNA Research Committee before approving the vaccine. Meanwhile, in Britain, David Bishop of the Institute of Virology in Oxford won official permission to spray a hundred trees in northern Scotland with a modified virus active against larvae of the pine moth. In the first instance, the genetic engineers merely marked the virus with an ineffectual but distinctive genetic tag, so that they could trace how it spread in the environment. This experiment was intended as a prudent preliminary to arming the virus with toxin-producing genes.

The prospect, then, was of never-ending scientific and administrative examination of proposals for releasing new organisms. As case law accumulated, wisdom might deepen. But while many Europeans wanted the onus of proof to rest heavily on the inventors of new organisms, the U.S. administration preferred a framework in which it was up to the regulatory agencies to intervene if they saw a risk. At the Savannah congress on plant molecular biology, government officials were at pains to reassure the scientists that they wanted to impede them as little as possible. National differences in policy and secrecy in commercial operations seemed likely to frustrate the good intentions of the scientists. For better or worse, yet another scientific genie was out of its bottle and refusing to go back.

Just as concern about civilian power plants often distracted attention from the far graver matter of nuclear weapons, so the question of military secrecy surrounding genetic engineering tended to be overlooked in the debate. When the Public Opinion Laboratory of Northern Illinois University polled religious, environmental, and science policy leaders about attitudes to recombinant-DNA research, in 1984, only 5 percent mentioned weapons as a risk, compared with 69 percent who visualized the possible creation of undesirable organisms. Had the U.S. Air Force possessed forest-eating bacteria during the Vietnam War, would it have refrained from using them in preference to the 2,4,5-T defoliant, to clear the vegetation that concealed the Vietnamese enemies?

The outcome

Policies on the use of genetically modified crops are still in disarray, with quite differing attitudes in Europe and the USA. But perhaps the warm-up debate on this subject has prepared the ground for dealing sensibly with wholly novel organisms.

Venter has fierce critics, but I’m reassured by the way he answers them in a BBC report. He said he was “driving the discussions” about the regulations governing this relatively new scientific field and about the ethical implications of the work.

“In 2003, when we made the first synthetic virus, it underwent an extensive ethical review that went all the way up to the level of the White House. And there have been extensive reviews including from the National Academy of Sciences, which has done a comprehensive report on this new field. We think these are important issues and we urge continued discussion that we want to take part in.”

A footnote on the reporting

I laugh when I see today how journalists keep referring to Craig Venter as a “controversial” or “maverick” scientist. This goes back to the late 1990s and the race to decode the human genome, between huge multinational publicly funded teams and Venter, with private funding and his “shotgun” technique. Here’s how Magic Universe begins that tale, in the story about the Human Genome.

DNA Alley they call it, and it is Maryland’s answer to California’s Silicon Valley, although offering biotechnology instead of microchips. The axis is Interstate 270, running through leafy outer suburbs of Washington DC towards Germantown. Turn off for Rockville, find Guide Drive, and after a few stoplights you’ll come to the white buildings of Celera Genomics. This is Maryland’s equivalent of Intel, housing what was for a while the most dazzling and most reviled enterprise in the history of biology.

Celera derived its name from the Latin celer, or swift, and it trademarked a company slogan, ‘Discovery can’t wait’. As the 21st Century dawned, 300 automated Perkin-Elmer sequencers at Celera were reading the coded genetic messages of human beings. These were written in the sequences of chemical letters A, T, C and G in the long chains of deoxyribonucleic acid, or DNA, that make up the genes of heredity.

The entire set of human genetic material, called the genome, consists of 3.2 billion letters strung out along two metres of DNA. The fineness of the molecular thread allows all that length to be scrunched up and packaged in 23 pairs of chromosomes at the heart of microscopic cells throughout the human body. At Rockville the DNA in all the chromosomes was tackled in one great spree, by the shotgun method.

That meant taking DNA from five individuals and chopping it with enzyme scissors into more than 20 million fragmented but overlapping sequences. The chromosome fragments were inserted by genetic engineering into bacteria, to produce millions of copies of each. The analysers read off the letters in the fragments. From these, a remarkable system of 800 interconnected computers found the overlaps and stitched the genes — or at least their codes — back together again.
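As an aside to the excerpt: for readers curious what “finding the overlaps” involves, here is a minimal, hypothetical Python sketch of greedy overlap assembly. It illustrates only the principle of merging reads on their longest suffix/prefix matches; it is nothing like Celera’s actual 800-computer pipeline, and the toy reads and helper functions (overlap, greedy_assemble) are invented purely for illustration.

```python
# A toy illustration of greedy overlap assembly, assuming short, error-free
# reads from one DNA strand. Invented example, not Celera's method or code.

def overlap(a, b, min_len=3):
    """Return the length of the longest suffix of a that is a prefix of b."""
    start = 0
    while True:
        start = a.find(b[:min_len], start)   # locate a candidate match in a
        if start == -1:
            return 0
        if b.startswith(a[start:]):          # the whole suffix of a matches b
            return len(a) - start
        start += 1                           # try the next candidate

def greedy_assemble(fragments):
    """Repeatedly merge the pair of fragments with the largest overlap."""
    frags = list(fragments)
    while len(frags) > 1:
        best_len, best_i, best_j = 0, None, None
        for i, a in enumerate(frags):
            for j, b in enumerate(frags):
                if i != j:
                    olen = overlap(a, b)
                    if olen > best_len:
                        best_len, best_i, best_j = olen, i, j
        if best_len == 0:                    # nothing overlaps any more
            return "".join(frags)
        merged = frags[best_i] + frags[best_j][best_len:]
        frags = [f for k, f in enumerate(frags) if k not in (best_i, best_j)]
        frags.append(merged)
    return frags[0]

if __name__ == "__main__":
    # Toy "reads" chopped, with overlaps, from ATTAGACCTGCCGGAATAC
    reads = ["GACCTGCCGG", "GCCGGAATAC", "ATTAGACCTG"]
    print(greedy_assemble(reads))            # prints ATTAGACCTGCCGGAATAC
```

Real assemblers must also contend with sequencing errors, repeated sequences and reads from both DNA strands, which is why stitching together more than 20 million fragments demanded so much computing power.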

The swiftness was breathtaking. Craig Venter, who masterminded Celera, had previously worked just down the road at the National Institute of Neurological Disorders and Stroke in Bethesda. While there he spent 10 years looking for a single gene. His computers at Rockville could do the job in 15 seconds.

Venter was deeply unpopular because he accomplished in 18 months, with 280 colleagues, what 3000 scientists in 250 laboratories around the world had taken 13 years to do. The Human Genome Project, funded by governments and charitable foundations, aimed at completion by 2005. In response to the Celera challenge it speeded up the work with extra funding.

Leaders of the public project in the USA had turned down Venter’s offer to work with them, saying that the whole-genome shotgun method he was proposing was impossible or hopelessly unreliable. When he was offered private funding to get on with it, the critics made Venter out to be an entrepreneur interested only in profit. Stories were planted in the media that Celera was going to patent all human genes.

Venter demonstrated the whole-genome method with the somewhat smaller genome of the fruit fly, Drosophila melanogaster. Early in 2000, when his team had sequenced 90 per cent of the human genome, Venter realized that he could save much time and money in piecing it together if he took in data that the accelerated public programme was releasing earlier than expected. The borrowing let the public-sector scientists claim that their original judgment was right and the whole-genome method didn’t work. This led to pantomime exchanges of the form, ‘Oh yes it did!’ and ‘Oh no it didn’t!’

The most open-minded onlookers found it hard to judge how much of the opposition to Venter was motivated by the wish to safeguard the public interest in the genome, and how much was professional pique or poor sportsmanship. The idea of ‘public good, private bad’ was questionable, to put it mildly. The public labs working on the human genome did not scruple to use a crucial method for multiplying their DNA samples invented by the Cetus Corporation. In a decade or two what will be remembered is that Venter’s method worked unaided in other species and his private intervention gave a much-needed shot of adrenalin to the public human-genome effort.

References

D.G. Gibson et al., “Creation of a Bacterial Cell Controlled by a Chemically Synthesized Genome”, Science Express, 20 May 2010

Dyson tree – see e.g. F.J. Dyson, “Warm-blooded plants and freeze-dried fish: the future of space exploration”, The Atlantic Monthly, November 1997

N. Calder, The Green Machines, pp. 92-98, Putnam, 1986

BBC quoting Venter http://news.bbc.co.uk/1/hi/science_and_environment/10132762.stm

N. Calder, Magic Universe, pp. 401-402, Oxford UP, 2003

