Quantum computing forges ahead

08/03/2012

Updating Magic Universe

So what’s a Majorana fermion then?

A news item in today’s Nature reminds me that last week it was all happening with quantum computing at a meeting of the American Physical Society. IBM announced a breakthrough in the technology, predicting practical computers of unimaginable power within 10 or 15 years. And in Nature Eugenie Samuel Reich discusses what seems to be a discovery of cosmic importance by a team in Delft, announced at the APS meeting. I’ll sum up two strands of progress in a brief update.

In Magic Universe the last section of the story called “BITS AND QUBITS: the digital world and its quantum shadow looming” reads so far:

Towards quantum computers

For a second revolution in information technology, the experts looked to the spooky behaviour of electrons and atoms known in quantum theory. By 2002 physicists in Australia had made the equivalent of Shannon’s relays of 65 years earlier, but now the switches offered not binary bits, but qubits, pronounced cue-bits. They raised hopes that the first quantum computers might be operating before the first decade of the new century was out.

   Whereas electric relays, and their electronic successors in microchips, provide the simple on/off, true/false, 1/0 options expressed as bits of information, the qubits in the corresponding quantum devices have many possible states. In theory it is possible to make an extremely fast computer by exploiting the ambiguities that are ever-present in quantum theory.

   If you’re not sure whether an electron in an atom is in one possible energy state, or in the next higher energy state permitted by the physical laws, then it can be considered to be both states at once. In computing terms it represents both 1 and 0 at the same time. Two such ambiguities give you four numbers, 00, 01, 10 and 11, which are the binary-number equivalents of good old 0, 1, 2 and 3. Three ambiguities give eight numbers, and so on, until with fifty you have a million billion numbers represented simultaneously in the quantum computer. In theory the machine can compute with all of them at the same time.
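The doubling described above is easy to check for yourself. Here is a short Python sketch of my own (illustrative only, not part of the original text) showing how each extra ambiguity doubles the count of numbers represented at once:

```python
# Each qubit ambiguity doubles the number of binary values
# represented simultaneously: n qubits cover all 2**n numbers.
for n in (1, 2, 3, 50):
    print(n, "qubits ->", 2 ** n, "numbers at once")

# Two ambiguities give the four binary numbers 00, 01, 10, 11:
print([format(i, "02b") for i in range(2 ** 2)])
```

With n = 50 the count is 2^50, about 1.1 × 10^15 — the “million billion” quoted above.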

   Such quantum spookiness spooks the spooks. The world’s secret services are still engaged in the centuries-old contest between code-makers and code-breakers. There are new concepts called quantum one-time pads for a supposedly unbreakable cipher, but some experts suspect that a powerful enough quantum computer could crack anything. Who knows what developments may be going on behind the scenes, like the secret work on digital computing by Alan Turing at Bletchley Park in England during the Second World War?

   The Australians were up-front about their intentions. They simply wanted to beat the rest of the world in developing a practical machine, for the sake of the commercial payoff it would bring. The Centre for Quantum Computer Technology was founded in January 2000, with federal funding, and with participating teams in the Universities of New South Wales, Queensland and Melbourne.

   The striking thing was the confidence of project members about what they were attempting. A widespread opinion at the start of the 21st Century held that quantum computing was beyond practical reach for the time being. It was seen as requiring exquisite delicacy in construction and operation, with the ever-present danger that the slightest external interference or mismanagement could cause the whole multiply parallel computation to cave in, like a mistimed soufflé.

   The qubit switches developed in Australia consist of phosphorus atoms implanted in silicon using a high-energy beam aimed with high precision. Phosphorus atoms can sustain a particular state of charge for longer than most atoms, thereby reducing the risk of the soufflé effect. A pair of phosphorus atoms, together with a transistor for reading out their state, constitutes one qubit. Unveiling the first example at a meeting in London, Robert Clark of New South Wales said, ‘This was thought to be impossible just a few years ago.’

Update March 2012 – subject to confirmation of the Majorana fermion

Ten years later, when many others had joined in a prolonged experimental quest for quantum computing, IBM researchers at Yorktown Heights claimed to be within sight of a practical device within 10 or 15 years. Dogging all the experimenters was a problem called decoherence – would the qubits survive long enough to be checked for possible errors?

In 2012 Matthias Steffen of IBM told a reporter, “In 1999, coherence times were about 1 nanosecond. Last year, coherence times were achieved for as long as 1 to 4 microseconds. With [our] new techniques, we’ve achieved coherence times of 10 to 100 microseconds. We need to improve that by a factor of 10 to 100 before we’re at the threshold [where] we want to be. But considering that in the past ten years we’ve increased coherence times by a factor of 10,000, I’m not scared.”
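Steffen’s arithmetic is easily verified. A tiny sketch of my own (the numbers come from the quotation above; working in whole nanoseconds keeps the ratios exact):

```python
# Coherence times from the Steffen quotation, in nanoseconds
t_1999 = 1              # about 1 ns in 1999
t_2012_low = 10_000     # 10 microseconds
t_2012_high = 100_000   # 100 microseconds

# Improvement factors over the 1999 figure
print(t_2012_low // t_1999)   # factor of 10,000
print(t_2012_high // t_1999)  # factor of 100,000
```

The low end matches the quoted “factor of 10,000”; the high end is ten times better still.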

Then it would be a matter of scaling up from devices handling one or two qubits to an array with, say, 250 qubits. That would contain more ordinary bits of information than there are atoms in the entire universe and it would be capable of performing millions of computations simultaneously. No existing code could withstand its probing, which probably explains why the US Army funded IBM’s work.
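The scale of a 250-qubit register can be put in figures directly. A quick sketch of my own, using Python’s exact big integers:

```python
# Number of binary values a 250-qubit register spans at once
states = 2 ** 250
print(states)            # the exact value, written out in full
print(len(str(states)))  # its length in decimal digits: 76
```

So 2^250 is a 76-digit number — roughly 1.8 × 10^75 values held simultaneously.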

Ettore Majorana - CERN image

A by-product of quantum computing research was the discovery of a new particle in the cosmos. In 1937 the Italian physicist Ettore Majorana adapted a theory by the British physicist Paul Dirac to predict a particle that is its own antiparticle – a very strange item indeed! It would be electrically neutral and exhibit peculiar behaviour.

A team led by Leo Kouwenhoven at Delft University of Technology in the Netherlands tested experimentally a suggestion from 2010 about how to create a pair of these particles. At a very low temperature and in a magnetic field, you touch a superconductor with an extremely fine semiconducting wire. As the signature of the presence of “Majorana fermions”, confirmed by the experimental team, the resistance in the wire becomes very low at zero voltage.

The Majorana particle opened a new route to quantum computing, because of its special ability to remember if it swaps places with a sibling. It was expected to be particularly resistant to the decoherence that plagued other techniques. So the Delft discovery promised a new research industry.

References

Steffen quoted by Alex Knapp in Forbes 28 February 2012 http://www.forbes.com/sites/alexknapp/2012/02/28/ibm-paves-the-way-towards-scalable-quantum-computing/

IBM Press Release 28 February 2012 http://www-03.ibm.com/press/us/en/pressrelease/36901.wss

Nature News 8 March 2012: http://www.nature.com/news/a-solid-case-for-majorana-fermions-1.10174

Nature News 28 Feb 2012 http://www.nature.com/news/quest-for-quirky-quantum-particles-may-have-struck-gold-1.10124

“A suggestion from 2010”: paper by Lutchyn et al. in PRL available at arXiv:1002.4033v2


Sausages without the pig

14/11/2011

Predictions revisited

Food production by tissue engineering

This drawing by Nik Spencer shows an as-yet unrealised concept of Morris Benjaminson at Touro College, New York. It introduces the theme, rather than illustrating the work in the Netherlands noted below. The source is an article by Nicola Jones in Nature 468, 752-753 (2010) and you can see a larger and more legible version here: http://www.nature.com/news/2010/101208/full/468752a/box/1.html

I’d still like to trace just where the idea originated. I know that in 1967 I was predicting “beef-steak without a cow” in The Environment Game, a book that visualized the land areas needed for agriculture being greatly reduced. In 1983, my contribution to The Future of a Troubled World pictured “endless sausages growing by tissue culture of pork muscle”. But now I learn that Winston Churchill was talking about “chicken breast without the chicken” back in 1931. Where did he get it from? I’ll go on checking.

The Churchill quote comes in a segment in “Brave New World with Stephen Hawking” on Channel 4 (14 November). It follows up stories of the past few years about developments, most notably in the Netherlands, that are gradually making it a reality.

Here’s what I wrote in 1967:

Tissue culture itself is one of the most attractive ideas for artificial food production. It is no longer far-fetched to think that we may learn how to grow beef-steak, for example, without a cow. Tissue culture, the technique for growing cells outside the organism from which they originated, is already used for special purposes in research and also for growing viruses in the manufacture of vaccine; the advent of polio vaccine depended on the successful cultivation of kidney cells. That in turn followed the introduction of antibiotics to preserve the cultures from the ravages of stray micro-organisms.

When cells are cultured by present techniques they tend to lose their specialized character. By deliberately letting specialized cells such as kidney or muscle revert to the undifferentiated nature of a newly fertilized egg, we can use them in a quite arbitrary way for a variety of synthetic purposes. If, on the other hand, we want to grow beef-steak we must simulate the conditions governing the growth and arrangement of the cells in the live animal, otherwise we shall finish up with something like finely divided mince.

The Netherlands launched a well-funded multi-university project in 2005, and in the Hawking show, Mark Evans visits Mark Post at Maastricht University who shows him muscle fibres forming artificially. Post has some commercial backing and declares himself “reasonably confident” that next year (2012) he’ll make a hamburger. But with a price tag on the burger at 250,000 euros it’s “still in the scientific phase”.

Besides reducing the land areas for meat production, eventual success with “in-vitro meat” will mean that astronauts bound for Mars can still have their burgers, sausages and chicken breast.

Added 15 November.

Suspecting that J.B.S. Haldane might have been an early predictor of synthetic food, I’ve dug out this 1923 lecture http://www.marxists.org/archive/haldane/works/1920s/daedalus.htm But he visualizes synthesis from scratch. “Many of our foodstuffs, including the proteins, we shall probably build up from simpler sources such as coal and atmospheric nitrogen.”

A step in the extrapolation to tissue culture seems to have come from Haldane’s evolutionist chum, Julian Huxley, in fictional form in a short story, “The Tissue-Culture King” (1927). There the tissue of an African ruler was proliferated in that way – but for power, not for food. So where did Churchill get his rather precise prediction from? En route I’ve found the Churchill source. In “Fifty Years Hence”, in Thoughts and Adventures, Thornton Butterworth (London) 1932, Winston wrote:

“Fifty years hence we shall escape the absurdity of growing a whole chicken in order to eat the breast or wing by growing these parts separately under a suitable medium.”

References

Nicola Jones, Nature, 468, 752-753, 2010

Nigel Calder, The Environment Game, Secker & Warburg (London) 1967

Ritchie Calder (ed), The Future of a Troubled World, Heinemann (London) 1983

Channel 4 (London), “Brave New World with Stephen Hawking,” Part 4, Environment, 14 November 2011. See http://www.channel4.com/programmes/brave-new-world-with-stephen-hawking/4od#3253407


Editors’ personal opinions

27/10/2011

Climate Change – News and Comments

Nature muddies the water

As a science writer I’m well used to picking my way through the minefield of embargoes on papers not yet published. I know, too, of possible risks to scientists as well as journalists, when quoting from preprints or even reporting results presented at a conference. Publication can be cancelled.

You’d expect clear guidance from leading journals on that subject. How bewildering then, to read an editorial “Scientific climate” in today’s Nature (vol. 478, p. 428). It’s on the subject of the Berkeley Earth / Richard Muller furore noted in my recent posts. The editorial’s sub-heading is:

Results confirming climate change are welcome, even when released before peer review.

… Where “climate change” is to be understood, I suppose, as “catastrophic manmade global warming”. Other points from the editorial are, as I construe them:

  • The welcome is the stronger because the Muller results can be used against the Republicans in the USA.
  • But Muller really should not have publicised his work as he did.
  • Muller is wrong to claim that Science and Nature forbid the discussion of unpublished results – Nature only opposes pre-publicity.
  • All that said, it was fine for physicists to give pre-publicity to apparent evidence of neutrinos travelling faster than light.

What on earth does all that mean, to scientists and journalists who are just trying to tell their stories promptly? Here are three extracts from Nature’s instructions to authors concerning embargoes, which can be seen in full here http://www.nature.com/authors/policies/embargo.html

“Material submitted to Nature journals must not be discussed with the media, except in the case of accepted contributions, which can be discussed with the media no more than a week before the publication date under our embargo conditions. We reserve the right to halt the consideration or publication of a paper if this condition is broken.”

“The benefits of peer review as a means of giving journalists confidence in new work published in journals are self-evident. Premature release to the media denies journalists that confidence. It also removes journalists’ ability to obtain informed reactions about the work from independent researchers in the field.”

“… communicate with other researchers as much as you wish, whether on a recognised community preprint server, on Nature Precedings, by discussion at scientific meetings (publication of abstracts in conference proceedings is allowed), in an academic thesis, or by online collaborative sites such as wikis; but do not encourage premature publication by discussion with the press (beyond a formal presentation, if at a conference).”

What the new editorial means, in my opinion, is that the politicisation of science has now penetrated right through to the workaday rituals of publication. On no account must you publicise your new work prematurely, unless you do it to bash the climate sceptics or the Republican Party or supporters of Special Relativity or anyone else the editors happen to dislike today. In that case they’ll forgive you.


Do clouds disappear? (4)

10/09/2011

Climate Change – News and Comments

Falsification tests of climate hypotheses

Warmer days and cooler nights when cosmic rays are scarce

Here’s a reminder of a climatic footnote to the 9/11 terrorist attacks on New York and Washington DC, ten years ago this weekend. With civilian aircraft grounded for three days, and without the contrails that usually criss-cross the skies of the USA, the difference between daytime and night-time temperatures at the surface increased. (See the Travis reference below.) Apparently like many other clouds (not all) the contrails reduce sunshine during the day and blanket the loss of heat at night. Take away those man-made clouds and the days become a little warmer and the nights a little cooler. In the jargon: the diurnal temperature range (DTR) increases.

(((Remark added 11 September. With comments coming in that cast doubt on that contrail story, I’ll repeat part of what I said in reply to Dahuang below. It doesn’t really matter what the reason was, for the post-9/11 increase in DTR, as long as everyone accepts that a loss of cloud was involved.)))

An echo of that mini-climatic event comes with the news that the DTR in Europe increases when there’s a big reduction in cosmic rays arriving at the Earth. With the implication that the skies are less cloudy at such times, it’s strong evidence in favour of Henrik Svensmark’s hypothesis that cosmic rays help to make clouds. The report comes from Aleksandar Dragić and his colleagues at the Institute of Physics in Belgrade. I’m grateful to Bengt Andersson for drawing their paper to my attention. It was published on 31 August and the full text is available here http://www.astrophys-space-sci-trans.net/7/315/2011/astra-7-315-2011.pdf  It’s typical of the pathetic state of science reporting that I still seem to have the story to myself ten days later.

More than a year ago I began a succession of posts on whether or not observations in the real world support or falsify the Svensmark hypothesis. The most explanatory was the first – see http://calderup.wordpress.com/2010/05/03/do-clouds-disappear/

The focus was on the “natural experiments” in which big puffs of gas from the Sun block some of the cosmic rays coming from the Galaxy towards the Earth. The resulting falls in cosmic ray influx, called Forbush decreases, last for a few days. The game is to look for observable reductions in cloudiness in the aftermath of these events. The results are most clearly favourable to the Svensmark hypothesis for the Forbush decreases with the largest percentage reductions in cosmic rays. Scientists keen to falsify the hypothesis have only to mix in some of the weaker events for the untidiness of the world’s weather to “hide the decline”.

The Serbs avoid that blunder by picking out the strongest Forbush decreases. And by using the simple, reliable and long-established weather-station measurements of temperature by night and day, they avoid technical, interpretive and data-availability problems that surround more direct observations of clouds and their detailed properties. The temperatures come from 184 stations scattered all across Europe (actually, so I notice, from Greenland to Siberia). A compilation by the Mount Washington Observatory that spans four decades, from 1954 to 1995, supplies the catalogue of Forbush decreases.

The prime results are seen here in Dragić et al.’s Figure 5. The graphs show the increase in the diurnal temperature range averaged across the continent in the days following the onset of cosmic ray decreases (day 0 on the horizontal scales). The upper panel is the result for 22 Forbush events in the range 7−10%, with a peak at roughly +0.35 °C in the diurnal temperature range. The lower panel is for 13 events greater than 10%. The peak goes to +0.6 °C and the influence lasts longer. It’s very satisfactory for the Svensmark hypothesis that the effect increases like this, with greater reductions in the cosmic rays. The results become hard (impossible?) to explain by any mechanism except an influence of cosmic rays on cloud formation.
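The Belgrade method is what climatologists call a superposed-epoch (composite) analysis: line up the days around each event onset and average. A minimal sketch of the idea in Python with NumPy — the data are made up and the event dates hypothetical; none of the numbers below come from the Dragić paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy daily diurnal-temperature-range (DTR) series, in degrees C,
# for one year at 184 stations (matching the station count above)
n_days, n_stations = 365, 184
dtr = rng.normal(10.0, 1.0, size=(n_days, n_stations))

event_days = [50, 120, 200, 300]   # hypothetical Forbush-decrease onsets
window = np.arange(-5, 11)         # days relative to onset (day 0)

# DTR anomaly relative to each station's own mean, averaged over
# stations, then superposed (averaged) across the events
anomaly = dtr - dtr.mean(axis=0)
epochs = np.stack([anomaly[d + window].mean(axis=1) for d in event_days])
composite = epochs.mean(axis=0)    # mean DTR anomaly vs. day from onset
print(composite.shape)             # (16,)
```

With real station data and real onset dates, a genuine cosmic-ray effect would show up as a bump in `composite` in the days after day 0, as in the paper’s Figure 5; with random data it stays near zero.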

To be candid, these results are much better than I’d have expected for observations from a densely populated continent with complex weather patterns, where air pollution and effects of vegetation confuse the picture of available cloud condensation nuclei. Svensmark’s team has emphasised the observable effects over the oceans. Now the approach taken by the Belgrade team opens the door to similar investigations in other continents. Let a march around the world’s land masses begin!

References

USA: diurnal temperatures post-9/11

D.J. Travis, A. Carleton and R.G. Lauritsen, “Contrails reduce daily temperature range”, Nature 418, 601, 2002

Europe: diurnal temperatures after Forbush decreases

A. Dragić, I. Aničin, R. Banjanac, V. Udovičić, D. Joković, D. Maletić and J. Puzović, “Forbush decreases – clouds relation in the neutron monitor era”, Astrophysics and Space Sciences Transactions, 7, 315–318, 2011.


CERN experiment confirms cosmic ray action

24/08/2011

Climate Change – News and Comments

The global warmists’ dam breaks

A graph they'd prefer you not to notice. Tucked away near the end of online supplementary material, and omitted from the printed CLOUD paper in Nature, it clearly shows how cosmic rays promote the formation of clusters of molecules (“particles”) that in the real atmosphere can grow and seed clouds. In an early-morning experimental run at CERN, starting at 03.45, ultraviolet light began making sulphuric acid molecules in the chamber, while a strong electric field cleansed the air of ions. It also tended to remove molecular clusters made in the neutral environment (n) but some of these accumulated at a low rate. As soon as the electric field was switched off at 04.33, natural cosmic rays (gcr) raining down through the roof of the experimental hall in Geneva helped to build clusters at a higher rate. How do we know they were contributing? Because when, at 04.58, CLOUD simulated stronger cosmic rays with a beam of charged pion particles (ch) from the accelerator, the rate of cluster production became faster still. The various colours are for clusters of different diameters (in nanometres) as recorded by various instruments. The largest (black) took longer to grow than the smallest (blue). This is Fig. S2c from supplementary online material for J. Kirkby et al., Nature, 476, 429-433, © Nature 2011

Long-anticipated results of the CLOUD experiment at CERN in Geneva appear in tomorrow’s issue of the journal Nature (25 August). The Director General of CERN stirred controversy last month, by saying that the CLOUD team’s report should be politically correct about climate change (see my 17 July post below). The implication was that they should on no account endorse the Danish heresy – Henrik Svensmark’s hypothesis that most of the global warming of the 20th Century can be explained by the reduction in cosmic rays due to livelier solar activity, resulting in less low cloud cover and warmer surface temperatures.

Willy-nilly the results speak for themselves, and it’s no wonder the Director General was fretful.

Jasper Kirkby

Jasper Kirkby of CERN and his 62 co-authors, from 17 institutes in Europe and the USA, announce big effects of pions from an accelerator, which simulate the cosmic rays and ionize the air in the experimental chamber. The pions strongly promote the formation of clusters of sulphuric acid and water molecules – aerosols of the kind that may grow into cloud condensation nuclei on which cloud droplets form. What’s more, there’s a very important clarification of the chemistry involved.

A breach of etiquette

My interest in CLOUD goes back nearly 14 years, to a lecture I gave at CERN about Svensmark’s discovery of the link between cosmic rays and cloudiness. It piqued Kirkby’s curiosity, and both Svensmark and I were among those who helped him to prepare his proposal for CLOUD.

By an unpleasant irony, the only Svensmark contribution acknowledged in the Nature report is the 1997 paper (Svensmark and Friis-Christensen) on which I based my CERN lecture. There’s no mention of the successful experiments in ion chemistry and molecular cluster formation by the Danish team in Copenhagen, Boulby and latterly in Aarhus where they beat CLOUD to the first results obtained using a particle beam (instead of gamma rays and natural cosmic rays) to ionize the air in the experimental chamber – see http://calderup.wordpress.com/2011/05/17/accelerator-results-on-cloud-nucleation-2/

What will historians of science make of this breach of scientific etiquette? That Kirkby was cross because Svensmark, losing patience with the long delay in getting approval and funding for CLOUD, took matters into his own hands? Or because Svensmark’s candour about cosmic rays casting doubt on catastrophic man-made global warming frightened the national funding agencies? Or was Kirkby simply doing his best (despite the results) to obey his Director General by slighting all things Danish?

Personal rivalries aside, the important question is what the new CLOUD paper means for the Svensmark hypothesis. Pick your way through the cautious prose and you’ll find this:

“Ion-induced nucleation [cosmic ray action] will manifest itself as a steady production of new particles [molecular clusters] that is difficult to isolate in atmospheric observations because of other sources of variability but is nevertheless taking place and could be quite large when averaged globally over the troposphere [the lower atmosphere].”

It’s so transparently favourable to what the Danes have said all along that I’m surprised the warmists’ house magazine Nature is able to publish it, even omitting the telltale graph shown at the start of this post. Added to the already favourable Danish experimental findings, the more detailed CERN result is excellent. Thanks a million, Jasper.

Enlightening chemistry

And in friendlier times we’d be sharing champagne for a fine discovery with CLOUD, that traces of ammonia can increase the production of the sulphuric clusters a thousandfold. It’s highlighted in the report’s title: “Role of sulphuric acid, ammonia and galactic cosmic rays in atmospheric aerosol nucleation” and it was made possible by the more elaborate chemical analysis in the big-team set-up in Geneva. In essence, the ammonia helps to stabilize the molecular clusters.

Although not saying it openly, the CLOUD team implies a put-down for the Danes with this result, repeatedly declaring that without ammonia there’d be little cluster production at low altitudes. But although the Aarhus experimenters did indeed assume the simpler reaction (H2SO4 + H2O), differing results in successive experimental runs made them suspect that varying amounts of trace impurities were present in the air cylinders used to fill their chamber. Now it looks as if a key impurity may have been ammonia. But some members of the CLOUD consortium also favoured (H2SO4 + H2O) and early runs in Geneva used no intentional ammonia. So they’ve little reason to scoff.

In any case, whether the basic chemistry is (H2SO4 + H2O) or (H2SO4 + H2O + NH3) is an academic rather than a practical point. There are always traces of ammonia in the real air, and according to the CLOUD report you need only one molecule in 30 billion. If that helps to oil Svensmark’s climatic motor, it’s good to know, but it calls for no apologies and alters the climatic implications not a jot.

The experiment's logo. The acronym “Cosmics Leaving Outdoor Droplets” always implied strong interest in Svensmark's hypothesis. And the roles of the Galaxy and the Sun are acknowledged.

Technically, CLOUD is a welcome advance on the Danish experiments. Not only is the chemistry wider ranging but molecular clusters as small as 1.7 nanometres in diameter are detectable, compared with 4 nm in Denmark. And the set-up enables the scientists to study the ion chemistry at lower temperatures, corresponding to increasing altitudes in the atmosphere. Cluster production soars as the temperature goes down, until “almost every negative ion gives rise to a new particle” [i.e. molecular cluster]. The lowest temperature reported in the paper is −25 °C. That corresponds to an altitude of 6000 metres, so unless you wish to visualize a rain of cloud-seeding aerosols from on high, it’s not very relevant to Svensmark’s interest in the lowest 3000 metres.

How the warmists built their dam

Shifting from my insider’s perspective on the CLOUD experiment, to see it on the broader canvas of the politicized climate science of the early 21st Century, the chief reaction becomes a weary sigh of relief. Although they never said so, the High Priests of the Inconvenient Truth – in such temples as NASA-GISS, Penn State and the University of East Anglia – always knew that Svensmark’s cosmic ray hypothesis was the principal threat to their sketchy and poorly modelled notions of self-amplifying action of greenhouse gases.

In telling how the obviously large influences of the Sun in previous centuries and millennia could be explained, and in applying the same mechanism to the 20th warming, Svensmark put the alarmist predictions at risk – and with them the billions of dollars flowing from anxious governments into the global warming enterprise.

For the dam that was meant to ward off a growing stream of discoveries coming from the spring in Copenhagen, the foundation was laid on the day after the Danes first announced the link between cosmic rays and clouds at a space conference in Birmingham, England, in 1996. “Scientifically extremely naïve and irresponsible,” Bert Bolin declared, as Chairman of the Intergovernmental Panel on Climate Change.

As several journalists misbehaved by reporting the story from Birmingham, the top priority was to tame the media. The first courses of masonry ensured that anything that Svensmark and his colleagues might say would be ignored or, failing that, be promptly rubbished by a warmist scientist. Posh papers like The Times of London and the New York Times, and posh TV channels like the BBC’s, readily fell into line. Enthusiastically warmist magazines like New Scientist and Scientific American needed no coaching.

Similarly the journals Nature and Science, which in my youth prided themselves on reports that challenged prevailing paradigms, gladly provided cement for higher masonry, to hold the wicked hypothesis in check at the scientific level. Starve Svensmark of funding. Reject his scientific papers but give free rein to anyone who criticizes him. Trivialize the findings in the Holy Writ of the Intergovernmental Panel on Climate Change. None of this is paranoia on my part, but a matter of close personal observation since 1996.

“It’s the Sun, stupid!” The story isn’t really about a bunch of naughty Danish physicists. They are just spokesmen for the most luminous agent of climate change. The Sun was what the warmists really wanted to tame with their dam, and they couldn’t do it. Coming to the Danes’ aid, by briefly blasting away many cosmic rays with great puffs of gas, the Sun enabled the team to trace in detail the consequent reduction in cloud seeding and liquid water in clouds. See my post http://calderup.wordpress.com/2010/05/03/do-clouds-disappear/ By the way, that research also disposes of a morsel of doubt in the new CLOUD paper, about whether the small specks made by cosmic rays really grow sufficiently to seed cloud droplets.

As knowledge accumulated behind their dam and threatened to overtop it, the warmists had one last course to lay. Paradoxically it was CLOUD. Long delays with this experiment to explore the microchemical mechanism of the Svensmark effect became the chief excuse for deferring any re-evaluation of the Sun’s role in climate change. When the microchemical mechanism was revealed prematurely by the SKY experiment in Copenhagen and published in 2006, the warmists said, “No particle accelerator? That won’t do! Wait for CLOUD.” When the experiment in Aarhus confirmed the mechanism using a particle accelerator they said, “Oh that’s just the Danes again! Wait for CLOUD.”

Well they’ve waited and their dam has failed them.

Hall of Shame

Retracing those 14 years, what if physics had functioned as it is supposed to do? What if CLOUD, quickly approved and funded, had verified the Svensmark effect with all the authority of CERN in the early 2000s? What if the Intergovernmental Panel on Climate Change had done a responsible job, acknowledging the role of the Sun and curtailing the prophecies of catastrophic warming?

For a start there would have been no surprise about the “travesty” that global warming has stopped since the mid-1990s, with the Sun becoming sulky. Vast sums might have been saved on misdirected research and technology, and on climate change fests and wheezes of every kind. The world’s poor and their fragile living environment could have had far more useful help than precautions against warming.

And there would have been less time for so many eminent folk from science, politics, industry, finance, the media and the arts to be taken in by man-made climate catastrophe. (In London, for example, from the Royal Society to the National Theatre.) Sadly for them, in the past ten years they’ve crowded with their warmist badges into a Hall of Shame, like bankers before the crash.

References

J. Kirkby et al., Nature, 476, 429–433, 2011. The author list and abstract are available at http://www.nature.com/nature/journal/v476/n7361/full/nature10343.html

H. Svensmark & E. Friis-Christensen, J. Atmos. Sol. Terr. Phys., 59, 1225–1232, 1997

Relevant Danish experimental reports since 2006, not cited in the new CLOUD paper

Henrik Svensmark, Jens Olaf Pepke Pedersen, Nigel Marsh, Martin Enghoff and Ulrik Uggerhøj, ‘Experimental Evidence for the Role of Ions in Particle Nucleation under Atmospheric Conditions’, Proceedings of the Royal Society A, Vol. 463, pp. 385–96, 2007 (online release 2006). This was the SKY experiment in a basement in Copenhagen.

Martin Andreas Bødker Enghoff, Jens Olaf Pepke Pedersen, Torsten Bondo, Matthew S. Johnson, Sean Paling and Henrik Svensmark, ‘Evidence for the Role of Ions in Aerosol Nucleation’, Journal of Physical Chemistry A, Vol. 112, pp. 10305–10309, 2008. Experiment in the Boulby deep mine in England.

M.B. Enghoff, J.O. Pepke Pedersen, U.I. Uggerhøj, S.M. Paling and H. Svensmark, ‘Aerosol nucleation induced by a high energy particle beam’, Geophysical Research Letters, 38, L09805, 2011. Experiment with an accelerator in Aarhus.


Superatomic circus

18/08/2010

Pick of the pics and Updating Einstein’s Universe & Magic Universe

Seeing the superatomic circus

When ultra-cold rubidium atoms club together in the superatoms called Bose-Einstein condensates, they usually make untidy crowds, as on the left. But a team led by Stefan Kuhr and Immanuel Bloch at the Max-Planck-Institut für Quantenoptik in Garching, Germany, brings them to order in a neater pattern, as seen in the middle picture. With more rubidium atoms the superatom grows wider (right). Criss-cross laser beams create a lattice-like pattern of pools of light where the atoms like to congregate. When the laser light’s electric field is relatively weak, the atoms jump (by quantum tunnelling) from one pool to another, creating the usual disorder. A stronger field, as in the central and right-hand images, fixes them in the novel state of matter called a Mott insulator. But atoms can be lost from the condensate, which explains the ring-like appearance on the right. Images from MPQ.

[You're recommended to click on the images for a better view]

Single atoms are located at the sites indicated by circles. From Fig. 3 of the Nature paper by Sherson et al.; see ref.

What’s new here, in an advance online publication in Nature, is not the creation of these kinds of superatoms but the German team’s success in imaging them, with a specially developed microscope that picks up fluorescence from the atoms caused by the cooling process. In the image on the right individual atoms are pinpointed.

It’s exciting stuff, because we’re probably seeing the dawn of a new technology – after electronics comes “atomics”. If individual atoms in a superatom can be manipulated, they might be used to carry “addressable” information in an atomic computer.



Guided hurricanes

17/08/2010

Predictions revisited and Climate Change: News and Comments

Guided hurricanes

When speculating four decades ago about the military uses of geophysics, Gordon J.F. MacDonald of UCLA contemplated the triggering of earthquakes or tsunamis, or melting polar ice with nuclear weapons. And he didn’t overlook the idea of steering hurricanes to ravage the enemy’s coasts. Reminding me of that prediction is a report now in press in Geophysical Research Letters, about how natural variations in the colour of the sea help to guide cyclones in the Pacific. A cyclone, remember, is a loosely used generic term that includes the major storms called hurricanes (Atlantic), typhoons (Pacific) or tropical cyclones (Indian Ocean and Australia).

Contributing to Unless Peace Comes, (1968), in a chapter entitled “How to Wreck the Environment”, MacDonald wrote:

… preliminary experiments have been carried out on the seeding of hurricanes. The dynamics of hurricanes and the mechanism by which energy is transferred from the ocean into the atmosphere supporting the hurricane are poorly understood. Yet various schemes for both dissipation and steering can be imagined. Although hurricanes originate in tropical regions, they can travel into temperate latitudes, as the residents of New England know only too well. A controlled hurricane could be used as a weapon to terrorize opponents over substantial parts of the populated world.



Silly season for melting ice

22/07/2010

Climate Change: News and Comments

The Silly Season Again for Melting Ice

At this time of year, while the Arctic sea ice dwindles under the midnight sun and the wind pushes it around, silly stories are needed to fill the pages of summer newspapers. So it’s party time for the global warming alarmists and their editorial cronies. For example, Nature magazine today laments:

“Arctic melting: The Arctic has set another record for losing sea ice. Last month saw the lowest extent of sea ice in the Arctic for any June since satellite records started in 1979.”

[Note added 25 July: The day after I posted this, CNSNews reported Sen. John Kerry (D-Mass.) as saying, “Instead of waiting until 2030 or whenever it was to have an ice-free Arctic, we’re going to have one in five or 10 years.”]

It’s a replay of the polar stories of 2007, mentioned in a 2008 talk on the Tradecraft of Propaganda that I posted earlier: http://calderup.wordpress.com/2010/06/07/tradecraft-of-propaganda/

Here’s the relevant extract from that talk:

Last year [2007] you were told – shock, horror! – that Arctic sea ice was at its lowest extent since satellite measurements began. How that news was trumpeted on television and radio and in all the newspapers! What went completely unreported was that simultaneously, at the other end of the world, Antarctic sea ice was at a record high. Although the big freeze in Antarctica was again plainly announced in a press release from the US weather bureau, NOAA, not a single newspaper in North America or Europe carried this news unfavourable to the global warming brigade.

Let’s check what’s going on this year, around the southern end of the Earth’s axis.



Milankovitch back to 1974

10/07/2010

Climate Change: News and Comments

Milankovitch and the ice ages – welcome back to 1974

Why am I chuckling? After he’d had misgivings about the Milankovitch theory of the comings and goings of the ice sheets, Luboš Motl now says in The Reference Frame:

“… the Milankovitch orbital cycles do describe the glaciation cycles in the recent 1 million years very well and nothing else – CO2 or random internal variations – is needed to account for the bulk of the data.”

You can read Motl’s story in full at http://motls.blogspot.com/2010/07/in-defense-of-milankovitch-by-gerard.html#more

– and download from there a 2006 paper that wins Motl over, by Gerard Roe of the University of Washington in Seattle.

The abstract of that paper reads (with my emphasis added):

The Milankovitch hypothesis is widely held to be one of the cornerstones of climate science. Surprisingly, the hypothesis remains not clearly defined despite an extensive body of research on the link between global ice volume and insolation changes arising from variations in the Earth’s orbit. In this paper, a specific hypothesis is formulated. Basic physical arguments are used to show that, rather than focusing on the absolute global ice volume, it is much more informative to consider the time rate of change of global ice volume. This simple and dynamically-logical change in perspective is used to show that the available records support a direct, zero-lag, antiphased relationship between the rate of change of global ice volume and summertime insolation in the northern high latitudes. Furthermore, variations in atmospheric CO2 appear to lag the rate of change of global ice volume. This implies only a secondary role for CO2 – variations in which produce a weaker radiative forcing than the orbitally-induced changes in summertime insolation – in driving changes in global ice volume.

Roe, G. (2006), In defense of Milankovitch, Geophys. Res. Lett., 33, L24703, doi:10.1029/2006GL027

The reason for my chuckles is that the “change in perspective” that Roe adopts was available more than 30 years earlier, in the first formal verification of Milankovitch, which I published in Nature in 1974. Using a pocket calculator, I simply assumed that the rate of change in global ice volume per thousand years was proportional to the difference between the summer sunshine at a high-ish northerly latitude and a level of sunshine at which the ice neither advances nor retreats.
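That pocket-calculator recipe is simple enough to sketch in a few lines of code. This is purely illustrative: the insolation values, the balance level and the constant of proportionality below are made-up numbers, not the actual data or coefficients used in the 1974 Nature paper.

```python
# Toy version of the rate-of-change idea: assume the change in global ice
# volume per 1,000-year step is proportional to the shortfall of high-latitude
# summer insolation below a balance level at which ice neither grows nor shrinks.
# All numbers here are hypothetical, for illustration only.

def ice_volume_series(insolation, balance=500.0, k=0.01, v0=1.0):
    """Integrate dV/dt = k * (balance - S(t)) step by step.

    insolation : summer insolation at a high northern latitude,
                 one value per 1,000-year step (arbitrary units)
    balance    : insolation level at which the ice neither advances nor retreats
    k          : constant of proportionality (arbitrary units)
    v0         : starting ice volume (arbitrary units)
    """
    v = v0
    volumes = [v]
    for s in insolation:
        v = max(0.0, v + k * (balance - s))  # ice volume cannot go negative
        volumes.append(v)
    return volumes

# Sunnier-than-balance millennia shrink the ice; dimmer ones let it grow.
series = ice_volume_series([520, 510, 480, 470, 490, 530], balance=500.0, k=0.01)
```

The point of the perspective change is visible even in this toy: the ice volume itself lags the sunshine, but its rate of change tracks the insolation with zero lag, exactly the antiphased relationship Roe's abstract describes.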



Fierce stellar black hole

07/07/2010

Pick of the pics

A fierce stellar black hole

To get this X-ray image, to be published in Nature tomorrow, NASA’s Chandra satellite stared for a total of 14 hours at a galaxy 13 million light-years away in the Sculptor constellation. The tartan pattern of pixels is a symptom of the great distance. A stellar black hole, or microquasar, its location marked in blue (X-rays of 2–8 keV), is throwing out two huge jets of hot gas reaching to the yellow-red hot-spots (X-rays of lower energy). The contour lines are for emissions from hydrogen atoms measured by the Cerro Tololo Inter-American Observatory. Other observations, by the European Southern Observatory, help to confirm that we’re seeing an exceptionally massive and greedy microquasar shedding much of its energy in the form of long jets of hot gas. From one jet end to the other is about 300 parsecs or 1,000 light-years – roughly the distance from the Solar System to the bright stars of Orion.
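The quoted scale is easy to check, since one parsec is about 3.26 light-years (a sketch of the arithmetic; the variable names are mine, not from the paper):

```python
# Convert the jet span from parsecs to light-years.
# 1 parsec ≈ 3.2616 light-years (rounded conversion factor).
PARSEC_IN_LIGHT_YEARS = 3.2616

jet_span_pc = 300
jet_span_ly = jet_span_pc * PARSEC_IN_LIGHT_YEARS  # about 978, i.e. roughly 1,000 light-years
```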

Nigel Calder comments: Apologies for two brief “Pick of the pics” in a row. I’ve been busy with writing unrelated to this blog.

Reference

Manfred W. Pakull, Roberto Soria and Christian Motch, “A 300 parsec long jet-inflated bubble around a powerful microquasar in the galaxy NGC 7793”, Nature, 466, pp. 209–212, 8 July 2010. The text of the paper is available here: http://www.eso.org/public/archives/releases/sciencepapers/eso1028/eso1028.pdf

