Archive for the ‘Changin’ the world’ Category

The cosmic ray revolution

Wednesday, November 12th, 2008

Cosmic rays, the high energy protons and helium nuclei that constantly bombard the Earth, have puzzled astronomers for the best part of a hundred years. Where do they come from and how are they accelerated to energies in excess of 10^20 eV? (That’s roughly the energy Roger Federer gives a tennis ball during a serve. By contrast, the Large Hadron Collider will be able to accelerate protons to a mere 10^12 eV.)
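As a sanity check on that comparison (the ball mass and the eV-to-joule conversion are standard figures, not from the paper), here’s the arithmetic in Python:

```python
# Convert 10^20 eV to joules and find the speed a regulation 57 g
# tennis ball would need to carry the same kinetic energy.
EV_TO_JOULES = 1.602e-19

energy_joules = 1e20 * EV_TO_JOULES     # ~16 J in a single particle
ball_mass_kg = 0.057                    # regulation tennis ball
ball_speed = (2 * energy_joules / ball_mass_kg) ** 0.5

print(f"{energy_joules:.1f} J, ball at {ball_speed * 3.6:.0f} km/h")
```

That works out to roughly 16 joules: a served-ball’s worth of macroscopic energy packed into one subatomic particle.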

To tackle these questions, astronomers have built a giant cosmic ray telescope in Argentina covering an area about the size of Rhode Island. It’s called the Pierre Auger Observatory and, in the short time it has been operating, it is already challenging astronomers’ views about the origin of cosmic rays. In particular, it’s beginning to look as if the highest energy cosmic rays come from active galactic nuclei.

Serguei Vorobiov from the University of Nova Gorica in Slovenia summarises the highlights. Worth a read if you want to get up to speed on a new generation of astronomy.

Ref: arxiv.org/abs/0811.0752: The Pierre Auger Observatory — a New Stage in the Study of the Ultra-High Energy Cosmic Rays

Saturn’s anomalous orbit flummoxes astronomers

Friday, November 7th, 2008

saturn

One of the first tests of Einstein’s theory of general relativity was to explain the precession of the perihelion of Mercury, which had long bamboozled astronomers. Newton’s law of gravity simply cannot account for it. But relativity does.

Now it’s Saturn’s turn to flummox astrophysicists. The Russian astronomer Elena Pitjeva, who heads the Laboratory of Ephemeris Astronomy at the Institute of Applied Astronomy in St Petersburg, has analysed a huge data set of planetary observations dating back to 1913, including 3D observations of the Cassini spacecraft now orbiting Saturn.

She says that the precession of Saturn’s perihelion, as predicted by general relativity, needs to be corrected to fit the data. The correction is tiny: -0.006 arcseconds per century.
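For scale, the full relativistic precession can be computed from the textbook formula for the perihelion advance per orbit, 6*pi*G*M / (c^2 * a * (1 - e^2)). The orbital elements below are standard values, not Pitjeva’s fitted ones:

```python
import math

GM_SUN = 1.327e20        # Sun's gravitational parameter, m^3/s^2
C = 2.998e8              # speed of light, m/s
RAD_TO_ARCSEC = 206265.0

def gr_precession(a_m, ecc, period_yr):
    """GR perihelion advance in arcseconds per century."""
    per_orbit = 6 * math.pi * GM_SUN / (C**2 * a_m * (1 - ecc**2))
    return per_orbit * (100.0 / period_yr) * RAD_TO_ARCSEC

mercury = gr_precession(5.79e10, 0.2056, 0.2408)   # ~43 arcsec/century
saturn = gr_precession(1.4335e12, 0.0565, 29.46)   # ~0.014 arcsec/century
print(mercury, saturn)
```

Saturn’s entire relativistic precession is only about 0.014 arcseconds per century, so a -0.006 correction would be nearly half the GR effect itself. That’s why the claim is so striking.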

That’s an astonishing claim but perhaps not surprising given the growing body of evidence that some kind of correction to gravity is needed to explain various puzzling phenomena such as the Pioneer and Flyby anomalies.

Obviously Pitjeva’s work needs to be independently verified but already the astronomy-mill is hard at work guessing what might cause the deviation from Einsteinian physics.

It’s possible, of course, that known physics will do the trick.

Lorenzo Iorio at the National Institute of Nuclear Physics in Pisa, Italy, runs through the candidate explanations from known physics:

Our knowledge of trans-neptunian objects may have enough uncertainty to allow for this kind of correction, but this turns out to generate a prograde precession, not the retrograde precession found by Pitjeva.

The Lense-Thirring effect generates a force that is four orders of magnitude too small to account for the difference.

Mutual cancellations among unmodelled or mismodelled effects may have conspired to cause the effect, but Iorio says this looks exceedingly unlikely.

Neither do various exotic modifications of gravity, nor the DGP braneworld model, explain the figures, says Iorio.

So what’s left? A magnificent conundrum for astronomers to puzzle over until they get better data and/or a new theory of gravity that explains all.

Ref: arxiv.org/abs/0811.0756: On the Recently Determined Anomalous Perihelion Precession of Saturn

Here come the quantum robots

Wednesday, October 22nd, 2008

quantum-robots.jpg

Quantum robots were first investigated in the late 1990s by Paul Benioff, a remarkably original thinker at Argonne National Laboratory in Illinois.

Benioff is currently occupied, more or less single-handedly, in holding a candle for a theory of everything based on quantum numbers.

So a team of Chinese physicists led by Daoyi Dong at the University of Science and Technology of China in Hefei has taken up the challenge of developing our ideas about quantum robotics a little further.

Benioff’s work explored the way in which a quantum robot might explore a 2D or 3D space using the laws of quantum mechanics to speed up the search. If memory serves, there is a decent speed up in two dimensions but not in three (which has interesting implications for molecular building machines). But he gave no thought to the internal structure of his robots or how they might be constructed.

The Chinese team have now given form to this structure. Quantum robots, they say, will consist of three parts:

i. an information processor consisting of one or several quantum computers

ii. some kind of quantum actuator that interacts with the environment to carry out a task

iii. a quantum sensor which monitors the environment, such as a SQUID (superconducting quantum interference device), which detects magnetic fields.

The team has mysteriously omitted a quantum communication module for sending data to, and receiving it from, their classical masters.

So what can a quantum robot do that, say, a classical robot attached to quantum sensors cannot?

That’s not entirely clear. Dong and co say that most planning and control problems in robotics can be posed as search problems, so Grover’s search algorithm gives a significant speed-up in the time it takes to solve them. But presumably the same would be true of a classical robot controlled by a quantum computer.
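To see the size of that speed-up (the numbers below are illustrative, not from the paper): unstructured search over N candidate plans takes of order N classical queries, while Grover’s algorithm needs roughly (pi/4) * sqrt(N) of them.

```python
import math

def classical_queries(n):
    return n  # worst case: check every candidate plan

def grover_queries(n):
    # optimal Grover iteration count is roughly (pi/4) * sqrt(n)
    return math.ceil(math.pi / 4 * math.sqrt(n))

for n in (10**3, 10**6, 10**9):
    print(n, classical_queries(n), grover_queries(n))
```

For a million candidate plans, the quantum search needs under 800 queries instead of a million, though, as noted, a classical robot steered by a quantum computer would enjoy the same gain.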

Where they might prosper is in size. Presumably quantum robots will operate at a scale that is not accessible to classical robots. And this raises the prospect of a world beneath our own populated by quantum machines operating on entirely different principles to ours. All this needs some fleshing out.

The Chinese team’s vision is far more sanguine, and they have dreamt up a list of predictable applications. They say:
“Quantum robots have many potentially important applications in military affairs, national defense, aviation and spaceflight, biomedicine, scientific research, safety engineering and other daily life tasks.”

(It must have taken them weeks to think up that list.)

Now all they have to do is build one of these guys.

Ref: arxiv.org/abs/0810.3283: Quantum robot: structure, algorithms and applications

And the number of intelligent civilisations in our galaxy is…

Monday, October 20th, 2008

31573.52

No really. At least according to Duncan Forgan at the Institute for Astronomy at the University of Edinburgh.

The Drake equation famously calculates the number of advanced civilisations that should populate our galaxy right now. The result is hugely sensitive to the assumptions you make about factors such as the number of potentially habitable planets orbiting a host star, how many of these actually develop life, what fraction of that life goes on to become intelligent, and so on.

Disagreement (ie general ignorance) over these numbers leads to estimates of the number of intelligent civilisations in our galaxy that range from 10^-5 to 10^6. In other words, your best bet is to pick a number, double it….
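The equation itself is simple; the spread comes entirely from the inputs. Here it is with two illustrative parameter sets of my own choosing (not Forgan’s), one optimistic and one pessimistic:

```python
def drake(R_star, f_p, n_e, f_l, f_i, f_c, L):
    """Drake equation: star formation rate x fraction of stars with
    planets x habitable planets per system x fractions developing life,
    intelligence and communication x civilisation lifetime (years)."""
    return R_star * f_p * n_e * f_l * f_i * f_c * L

optimistic = drake(10, 0.5, 2, 1.0, 0.1, 0.1, 1e5)       # ~10,000
pessimistic = drake(1, 0.2, 0.1, 0.001, 0.01, 0.1, 100)  # ~2e-6
print(optimistic, pessimistic)
```

The same formula, fed with defensible-looking numbers, spans ten orders of magnitude.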

So Forgan has attempted to inject a little more precision into the calculation. His idea is to simulate, many times over, the number of civilisations that may have appeared in a galaxy like ours, using reasonable modern estimates for the values in the Drake equation.

With these statistics you can calculate an average value and a standard deviation for the number of advanced civilisations in our galaxy.
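A minimal Monte Carlo sketch of that approach (the sampling ranges below are placeholders I have invented, not Forgan’s calibrated inputs):

```python
import random
import statistics

random.seed(1)

def sample_drake():
    # draw each Drake factor uniformly from a plausible range
    R = random.uniform(5, 15)        # star formation rate, per year
    f_p = random.uniform(0.3, 0.6)   # fraction of stars with planets
    n_e = random.uniform(0.5, 2.0)   # habitable planets per system
    f_l = random.uniform(0.1, 1.0)   # fraction developing life
    f_i = random.uniform(0.01, 0.2)  # fraction becoming intelligent
    f_c = random.uniform(0.1, 0.5)   # fraction becoming detectable
    L = random.uniform(1e3, 1e5)     # civilisation lifetime, years
    return R * f_p * n_e * f_l * f_i * f_c * L

runs = [sample_drake() for _ in range(10_000)]
print(statistics.mean(runs), statistics.stdev(runs))
```

Repeat the draw thousands of times and the mean and standard deviation fall out directly, which is exactly the kind of statistic Forgan reports for each of his models.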

Better still, it allows you to compare the results of different models of civilisation creation.

Forgan has clearly had some fun comparing three models:

i. panspermia: if life forms on one planet, it can spread to others in a system

ii. the rare-life hypothesis: Earth-like planets are rare but life progresses pretty well on them when they occur

iii. the tortoise and hare hypothesis: Earth-like planets are common but the steps towards civilisation are hard

And the results are:

i. panspermia predicts  37964.97 advanced civilisations in our galaxy with a standard deviation of 20.

ii. the rare life hypothesis predicts 361.2 advanced civilisations with an SD of 2

iii. the tortoise and hare hypothesis predicts 31573.52 with an SD of 20.

Those are fantastically precise numbers. But before you start broadcasting to your newfound friends with a flashlight, it’s worth considering their accuracy.

The results of simulations like this are no better than the assumptions you make in developing them. And these, of course, are based on our manifestly imperfect but rapidly improving knowledge of the heavens.

The real question is whether we’ll ever have good enough data to plug in to a model like this to give us a decent answer, without actually discovering another intelligent civilisation. And the answer to that is almost certainly not.

Ref: http://arxiv.org/abs/0810.2222: A Numerical Testbed for Hypotheses of Extraterrestrial Life and Intelligence

Entangled photons to produce better quantum images

Thursday, October 16th, 2008

A while back, we saw how quantum imaging had been put on a firmer theoretical footing, thanks to some new thinking by Seth Lloyd at MIT.

Quantum imaging involves sending one of a pair of entangled photons towards an object while holding on to the other.

For a long while nobody was quite sure what benefit you might get from this entanglement. Some physicists speculated that it could be possible to produce reflection-free images by measuring the entangled twin that you hang on to, even if the other photon never returns.

What Lloyd calculated was that illuminating an object with entangled photons can increase the signal to noise ratio of the reflected signal by a factor of 2^e, where e is the number of bits of entanglement. That’s an exponential improvement.

Now he and a few pals have filled in a few details that make the scheme more realistic, working out how it could be done with the kind of Gaussian states of light that experimenters can actually produce. The idea is to send photons towards an object and use any reflection to determine whether the object is present or absent.

When entangled photons are used, this detection should be far more efficient, even in noisy conditions.

The result would effectively be the first quantum image taken with entangled photons.

Now all we’re waiting for is experimental proof of the scheme which, if I’m not mistaken, won’t be long in coming. The work was part funded by DARPA’s Quantum Sensor Program so it’ll be interesting to see what plans the organisation has for this technique.

Ref: arxiv.org/abs/0810.0534: Quantum illumination with Gaussian states

How to test the many worlds interpretation of quantum mechanics

Tuesday, October 7th, 2008

mwi1.jpg
The many worlds interpretation of quantum mechanics holds that before a measurement is made, identical copies of the observer exist in parallel universes and that all possible results of a measurement actually take place in these universes.

Until now there has been no way to distinguish between this and the Born interpretation. This holds that each outcome of a measurement has a specific probability and that, while an ensemble of measurements will match that distribution, there is no way to determine the outcome of a specific measurement.

Now Frank Tipler, a physicist at Tulane University in New Orleans says he has hit upon a way in which these interpretations must produce different experimental results.

His idea is to measure how quickly individual photons hitting a screen build into a pattern. According to the many worlds interpretation, this pattern should build more quickly, says Tipler.

And he points out that an experiment to test this idea would be easy to perform. Simply send photons through a double slit onto a screen and measure where each one hits. Once the experiment is over, a simple mathematical test of the data tells you how quickly the pattern formed.
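Here’s a toy version of that measurement (my own sketch with an idealised fringe pattern; Tipler’s actual statistic will differ): sample photon hits from a double-slit intensity distribution and track how far the accumulated hits sit from the predicted pattern as the count grows.

```python
import bisect
import math
import random

random.seed(0)

def intensity(x):
    # idealised two-slit pattern: fringes under a Gaussian envelope
    return math.cos(3 * x) ** 2 * math.exp(-x * x)

def sample_hit():
    # rejection sampling: intensity() never exceeds 1 on this interval
    while True:
        x = random.uniform(-3.0, 3.0)
        if random.random() < intensity(x):
            return x

# predicted cumulative pattern on a grid, by crude numerical integration
xs = [-3.0 + 6.0 * i / 600 for i in range(601)]
acc, cdf = 0.0, []
for x in xs:
    acc += intensity(x)
    cdf.append(acc)
cdf = [c / acc for c in cdf]

def pattern_error(hits):
    """Largest gap between empirical and predicted cumulative pattern."""
    hits = sorted(hits)
    return max(abs(bisect.bisect_right(hits, x) / len(hits) - c)
               for x, c in zip(xs, cdf))

hits = [sample_hit() for _ in range(2000)]
print(pattern_error(hits[:100]), pattern_error(hits))
```

The error typically falls off as one over the square root of the photon count; Tipler’s claim is that the two interpretations predict measurably different convergence rates.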

This experiment is almost trivial so we should find out pretty quickly which interpretation of quantum mechanics Tipler’s test tells us is right.

Then it boils down to whether you believe his reasoning.

(And not everybody does. When Tipler published his book The Physics of Immortality, one reviewer described it as “a masterpiece of pseudoscience”.)

Let’s hope this paper is received a little more positively than his books.

Ref: arxiv.org/abs/0809.4422: Testing Many-Worlds Quantum Theory By Measuring Pattern Convergence Rates

How alien Earths will reveal their secrets

Friday, October 3rd, 2008

The European Space Agency has set itself an ambitious goal: to recognise the biomarkers on Earth-like planets orbiting other stars.

The first step in such an endeavour is to work out what to look for, which is the goal that Lisa Kaltenegger at the Harvard-Smithsonian Center for Astrophysics in Cambridge and Franck Selsis at the Laboratoire d’Astrophysique de Bordeaux in France have set themselves.

“The spectrum of the planet can contain signatures of atmospheric species that are important for habitability, like CO2 and H2O, or resulting from biological activity (O3, CH4, and N2O),” they say.  “The presence or absence of these spectral features will indicate similarities or differences with the atmospheres of terrestrial planets.”

But just how similar is a question of some controversy. In one of the most fascinating papers of all time, Carl Sagan and friends analysed a spectrum of the Earth taken by the Galileo probe, searching for signatures of life. They concluded that the large amount of O2 and the simultaneous presence of CH4 traces are strongly suggestive of biology.

But a more detailed study of this parameter space is necessary and not just from a theoretical point of view, conclude Kaltenegger and Selsis.

And time is short. With over 300 giant exoplanets already detected, it is only a matter of time, maybe only months, before astrobiologists have their first alien test case to analyse.

Ref: arxiv.org/abs/0809.4042: Atmospheric Modeling: Setting Biomarkers in Context

Periodic Pioneer anomaly points to modified general relativity

Friday, September 26th, 2008

periodic-pioneer-anomaly.jpg

The Pioneer anomaly grows ever more fascinating.

Here’s the background: Pioneer 10 and 11 were launched in 1972 and 1973 respectively and, after sweeping past a number of the outer gas giants, have been heading out of the solar system ever since.

NASA has been accurately tracking their position and speed using Doppler tracking measurements of radio signals from the craft. But this data has thrown up a problem. Both probes appear to be decelerating faster than can be explained by the Sun’s gravity. All that has been widely discussed and numerous explanations have been put forward to explain this discrepancy.
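The unexplained deceleration is tiny, about 8.74 x 10^-10 m/s^2 in the standard Doppler analyses. Here’s what it amounts to over a decade of cruise:

```python
# Cumulative velocity change from the anomalous (sunward) deceleration.
anomalous_accel = 8.74e-10      # m/s^2, the standard fitted value
seconds = 10 * 3.156e7          # ten years, in seconds

delta_v = anomalous_accel * seconds
print(f"velocity change: {delta_v * 1000:.0f} mm/s over a decade")
```

Small as it is, a drift of a few hundred millimetres per second over ten years is well within reach of Doppler tracking, which is why the anomaly has proved so stubborn.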

What isn’t so well known is that there is a periodic component to the anomaly. The team at NASA’s Jet Propulsion Laboratory that has been collecting the data says it’s unlikely that this variation comes from the spacecraft themselves. Instead, they think it is probably the result of something at our end, such as a tiny variation in Earth’s orbit.

Now Bruno Christophe and pals from ONERA, the French aerospace lab near Paris, and various other French institutions have carried out the most detailed analysis yet of these periodic variations, and they raise another interesting possibility.

A number of people have suggested modifications to general relativity that would explain the Pioneer anomaly. But there has never been a way to test these modifications.

Now Christophe and co say that the periodic variations are compatible with the effects on radio signals that some of these modifications might cause.

That’s an extraordinary claim. Obviously, more analysis is needed and it always pays to be cautious with these kinds of ideas. But could it really be possible that the Pioneer anomaly is the first evidence of physics beyond Einstein’s version of general relativity?

Ref: arxiv.org/abs/0809.2682: Pioneer Doppler Data Analysis: Study of Periodic Anomalies

How supermassive black holes help galaxies evolve

Friday, September 12th, 2008

black-hole-nasa.jpg

It’s easy to imagine that our understanding of the way galaxies form and evolve is more or less complete. After all, we’ve been fitting missing pieces into the jigsaw at an alarming rate in recent years with all this data from WMAP etc about the structure of the early universe, a better understanding of the distribution of dark matter and the vast computer simulations that show how galaxies should appear out of this maelstrom.

But there are one or two hairs in this astrophysical ointment. For example, our models of galaxy formation indicate that certain types of galaxies should become surrounded by huge clouds of gas in which stars ought to be forming. But observations show that there are far fewer of these types of galaxies than the models predict.

Today, Timothy Heckman of Johns Hopkins University in Baltimore discusses the idea that supermassive black holes at the center of these galaxies might explain the difference. The thinking is that black holes generate and spread enough energy to the outer reaches of the galaxy to regulate star formation in a way that fits with observations. We’ve certainly seen good evidence of supermassive black holes in various galaxies, including our own.

But what makes Heckman’s discussion highly provocative is the suggestion that a symbiotic relationship exists between galaxies and supermassive black holes: that they need each other to form. That would make supermassive black holes as important in galactic evolution as gas, dust and gravity. What an idea!

What that means is that far from being a done deal, galaxy formation is set to become one of the hottest topics in astronomy as data from the next generation of space telescopes comes flooding in.

PS: Heckman has a great name for the study of gas-star-black hole cosmic ecosystems. He calls it gastrophysics. Like it!

Ref: arxiv.org/abs/0809.1101: The Co-Evolution of Galaxies and Black Holes: Current Status and Future Prospects

Supernova over south pole caused Ordovician mass extinction

Thursday, September 11th, 2008

ordovician-extinction.jpg

About 444 million years ago, more than half of all marine invertebrates were wiped out at the end of the Ordovician period in one of the worst mass extinctions in Earth’s history.

A couple of years ago, Brian Thomas at the University of Kansas pointed out that this holocaust could have been caused by a nearby supernova zapping the Earth with gamma rays. A 10-second burst of gamma rays, he said, would have done for about half of Earth’s ozone layer, leaving life here more or less unprotected from the Sun’s harmful UV rays for 10 years or more.

Organisms living deep beneath the waves would, of course, have been protected from UV rays anyway but those nearer the surface would have been wiped out within that time. What makes Thomas’ idea interesting is that the geological record seems to indicate that species living nearer the surface were hardest hit in the Ordovician extinction.

Today, Thomas and a pal give us an update based on more detailed simulations. They say the geological data is consistent with a gamma ray burst somewhere over the South Pole.

And this also allows them to predict that any large land mass well above the equator would have been shielded from the burst and so the geological record there ought to be different. Thomas suggests that northern China would be as good a place as any to start looking.

Better get digging.

Ref: arxiv.org/abs/0809.0899: Late Ordovician Geographic Patterns of Extinction Compared with Simulations of Astrophysical Ionizing Radiation Damage