Archive for the ‘Fightin’’ Category

Calculating the cost of dirty bombs

Wednesday, February 25th, 2009

One of the more frightening scenarios that civil defence teams worry about is the possibility that a bomb contaminated with radioactive material would be detonated in a heavily populated area.

Various research teams have considered this problem and come to similar conclusions: the actual threat to human health from such a device is low. Some even claim that terror groups must have reached the same conclusion, which is why we’ve not been on the receiving end of such an attack. The panic such a device would cause is another question.

Today Theodore Liolios, from a private institution called the Hellenic Arms Control Center in Thessaloniki, Greece, goes through the figures.

He says the most likely material to be used in such an attack is cesium-137, widely used throughout the developed and developing world as a source for medical therapies.

The unstated implication is that it would be relatively easy to get hold of this stuff from a poorly guarded hospital. Exactly this happened in Goiania in Brazil when an abandoned hospital was broken into and its supply of cesium-137 distributed around the surrounding neighbourhoods. The incident left 200 people contaminated. Four of them died.

But a dirty bomb would not be nearly as lethal. The trouble with them (from a terrorist’s point of view, at least) is that distributing radioactive material over a large area dramatically reduces the exposure that people receive. Particularly when most could be warned to stay indoors or be evacuated (unlike the Goiania incident in which most people were unaware they were contaminated).

Liolios calculates that anybody within a 300 metre range of a dirty bomb would increase their lifetime risk of cancer mortality by about 1.5 per cent, and then only if they were unable to take shelter or leave the area. Given the population densities typical of metropolitan areas, that is about 280 people.

And he goes on to say that it is reasonable to assume that a cesium-137 dirty bomb would not increase the cancer mortality risk for the entire city by a statistically significant amount.

But the terror such a device might cause is another question. Liolios reckons that current radiation safety standards would mean the evacuation of some 78 square kilometres around ground zero. That would affect some 78,000 people, cost $7.8m per day to evacuate and some $78m to decontaminate.
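Those round figures imply some simple unit rates. The sketch below back-derives them; the population density and per-unit costs are my own assumptions, chosen to reproduce the article’s totals rather than taken from the paper.

```python
import math

# Assumed round inputs, chosen to reproduce the article's totals;
# the paper's actual parameters may differ.
area_km2 = 78.0               # evacuated area around ground zero
density = 1000.0              # people per km^2 (assumed)
cost_per_person_day = 100.0   # dollars per person per day (assumed)
cleanup_per_km2 = 1e6         # dollars per km^2 (assumed)

people = area_km2 * density
daily_cost = people * cost_per_person_day
cleanup = area_km2 * cleanup_per_km2
print(people, daily_cost, cleanup)   # 78000 people, $7.8m per day, $78m

# The ~280 people within 300 m of the blast follow from the same density:
print(round(math.pi * 0.3**2 * density))   # 283
```

With these assumed rates, all four of the article’s figures fall out of two parameters: the evacuated area and the population density.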

That seems a tad conservative, but however it is calculated, it may turn out to be chickenfeed compared to the chaos caused by panic, which may well result in more deaths than the bomb itself could claim. How to calculate the effect of that?

Ref: arxiv.org/abs/0902.3789: The Effects of Using Cesium-137 Teletherapy Sources as a Radiological Weapon

The power laws behind terrorist attacks

Friday, February 6th, 2009

Plot the number of people killed in terrorist attacks around the world since 1968 against the frequency with which such attacks occur and you’ll get a power law distribution: a fancy way of saying a straight line when both axes have logarithmic scales.

The question, of course, is why? Why not a normal distribution, in which there would be many orders of magnitude fewer extreme events?

Aaron Clauset and Frederik Wiegel have built a model that might explain why. The model makes five simple assumptions about the way terrorist groups grow and fall apart and how often they carry out major attacks. And here’s the strange thing: this model almost exactly reproduces the distribution of terrorist attacks we see in the real world.

These assumptions are things like: terrorist groups grow by accretion (absorbing other groups) and fall apart by disintegrating into individuals. They must also be able to recruit from a more or less unlimited supply of willing terrorists within the population.
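A toy version of such a model is easy to sketch. The code below is my own loose interpretation, not the authors’ actual model: groups fuse by accretion, occasionally shatter back into individuals, fresh recruits arrive steadily, and a group’s size at the moment it disintegrates stands in for the severity of an attack it could mount. Even this crude version produces a heavily skewed distribution of event sizes.

```python
import random

def simulate(steps=20_000, p_disintegrate=0.01, seed=1):
    """Toy fission-fusion sketch (my own parameters, not the paper's):
    groups grow by accretion, occasionally disintegrate into individuals,
    and recruits arrive steadily. A group's size when it disintegrates
    stands in for the severity of an 'event'."""
    random.seed(seed)
    groups = [1] * 1000                   # start as lone individuals
    severities = []
    for _ in range(steps):
        if random.random() < p_disintegrate:
            i = random.randrange(len(groups))
            size = groups.pop(i)
            groups.extend([1] * size)     # fission: shatter into individuals
            severities.append(size)
        elif len(groups) >= 2:
            i, j = random.sample(range(len(groups)), 2)
            merged = groups[i] + groups[j]
            # remove the larger index first so the smaller stays valid
            for k in sorted((i, j), reverse=True):
                groups.pop(k)
            groups.append(merged)         # fusion: accretion of two groups
        groups.append(1)                  # effectively unlimited recruits
    return severities

sev = simulate()
median = sorted(sev)[len(sev) // 2]
# Skewed distribution: the biggest "event" dwarfs the typical one.
print(len(sev), median, max(sev))
```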

Being able to reproduce the observed distribution of attacks with such a simple set of rules is an impressive feat. But it also suggests some strategies that might prevent such attacks or drastically reduce their number. One obvious strategy is to reduce the number of recruits within a population, perhaps by reducing real and perceived inequalities across societies.

Easier said than done, of course. But analyses like these should help to put the thinking behind such ideas on a logical footing.

Ref: arxiv.org/abs/0902.0724: A Generalized Fission-Fusion Model for the Frequency of Severe Terrorist Attacks

Massive miscalculation makes LHC safety assurances invalid

Wednesday, January 28th, 2009

It just gets worse for CERN and its attempts to reassure us that the Large Hadron Collider won’t make mincemeat of the planet.

It’s beginning to look as if a massive miscalculation in the safety reckonings means that CERN scientists cannot offer any assurances about the work they’re doing.

In a truly frightening study, Toby Ord and pals at the University of Oxford say that “while the arguments for the safety of the LHC are commendable for their thoroughness, they are not infallible.”

When physicists give a risk assessment, their figure is only correct if their argument is valid. So an important question is: what are the chances that the reasoning itself is flawed?


Reinventing the dismal science

Tuesday, January 20th, 2009

The discipline of economics is in crisis. The credit crunch has exposed many economists’ most cherished ideas for the nonsense they manifestly are. With its theories in tatters, what now for the dismal science?

It looks as if the best bet is to take a few leaves out of some network science textbooks. Economies are complex networks, after all, although economists have failed to notice.

Until now! One of the first onto the network bandwagon is Nobel prize-winner Joseph Stiglitz from Columbia University, who with a few pals has been examining the Japanese credit network between banks and quoted firms in 2004.

Since the collapse of the bubble economy in the early 90s, the Japanese banking system has been going through a similar crisis to the one sweeping the west, so there is plenty to learn (not least of which is the time it can take to turn things around, although most western economists are ignoring this at present).

The paper makes both heartening and frightening reading.

Why astronomical units need to be redefined

Thursday, December 18th, 2008

In 1983, the International Bureau of Weights and Measures defined the metre as the distance travelled by light in a vacuum in 1⁄299,792,458 of a second. That makes a metre a fixed unit of length.

For astronomers, however, distance is rather more malleable. In astronomy, distance is measured in astronomical units, and astronomers think of an au as the distance between the Earth and the Sun. But since the Earth’s orbit is elliptical, this distance isn’t constant. At one time an au was defined as the length of the semi-major axis of the Earth’s orbit. But that isn’t constant either.

So in 1976, the Bureau International des Poids et Mesures in France defined the au to be the distance from the centre of the Sun at which a particle of negligible mass would orbit in 365.2568983 days.

The trouble with this definition is that it depends both on the mass of the Sun and the gravitational constant G.

That’s not good, say Nicole Capitaine and Bernard Guinot at the Observatoire de Paris. The gravitational constant and the mass of the Sun can only be measured with limited accuracy. And who’s to say their values aren’t changing anyway? It makes no sense to define a fundamental unit of length in these terms.

They say astronomers desperately need a new definition and point out there is an obvious choice: define an au as some suitable multiple of a metre.

That would bring astronomy into line with SI units and make various calculations much more straightforward.
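Their proposal is easy to illustrate. If the au were fixed as an exact number of metres, in the same spirit as the 1983 metre definition, conversions would become trivial and exact. The value below is an assumption for illustration, chosen close to modern measurements of the Earth-Sun distance.

```python
# Sketch: fix the au as an exact number of metres (the particular value
# here is an illustrative assumption, close to measured values).
AU_IN_M = 149_597_870_700   # exact by definition, in this sketch
C = 299_792_458             # speed of light in m/s (exact since 1983)

# Light-travel time for one au falls out immediately:
light_time = AU_IN_M / C
print(light_time)           # ~499 seconds, about 8.3 minutes

# And astronomical distances convert exactly, e.g. Neptune's ~30 au orbit:
print(30 * AU_IN_M)         # metres, with no dependence on G or the Sun's mass
```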

So what are they waiting for?

Ref: arxiv.org/abs/0812.2970: The Astronomical Units

Steganophony–when internet telephony meets steganography

Friday, November 28th, 2008

Steganophony is the term coined by Wojciech Mazurczyk and Józef Lubacz at the Warsaw University of Technology in Poland to describe the practice of hiding messages in internet telephony traffic (presumably the word is an amalgamation of the terms steganography and telephony).

The growing interest in this area is fueled by the fear that terrorist groups may be able to use services such as Skype to send messages secretly by embedding them in the data stream of internet telephony. At least that’s what Mazurczyk and Lubacz tell us.

The pair has developed a method for doing exactly that, called Lost Audio PaCKets Steganography or LACK, and outlines it on the arXiv today.

LACK exploits a feature of internet telephony systems: they ignore data packets that are delayed by more than a certain time. LACK plucks data packets out of the stream, changes the information they contain and then sends them on after a suitable delay. An ordinary receiver simply ignores these packets because they arrive too late, but the intended receiver collates them and extracts the information they contain.
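The mechanism is simple enough to caricature in a few lines. This is a toy model, not the authors’ implementation: real VoIP runs over RTP with jitter buffers, and the delay threshold and packet format here are invented for illustration.

```python
MAX_DELAY = 0.2  # seconds; hypothetical jitter-buffer limit

def transmit(voice_packets, secret):
    """Sender: replace the payload of a few packets with covert data and
    delay them past the receiver's limit. Packets are (seq, delay, payload)."""
    stream = []
    for i, payload in enumerate(voice_packets):
        if i < len(secret):
            stream.append((i, MAX_DELAY * 2, secret[i:i+1]))  # covert, late
        else:
            stream.append((i, 0.05, payload))                 # normal, timely
    return stream

def normal_receiver(stream):
    # An ordinary endpoint plays only timely packets; the late ones
    # look like routine packet loss.
    return [p for (_, delay, p) in stream if delay <= MAX_DELAY]

def covert_receiver(stream):
    # The intended receiver collates the "lost" packets instead.
    return b"".join(p for (_, delay, p) in stream if delay > MAX_DELAY)

voice = [b"v%d" % i for i in range(10)]
stream = transmit(voice, b"hi")
hidden = covert_receiver(stream)
survived = normal_receiver(stream)
print(hidden)           # b'hi'
print(len(survived))    # 8
```

To an eavesdropper, the covert packets are indistinguishable from ordinary late arrivals, which is exactly what makes the scheme hard to spot.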

That makes LACK rather tricky to detect, since dropped packets are a natural phenomenon of internet traffic.

But is this really an area driven by the threat of terrorism? If anybody really wants to keep messages secret then there are plenty of easier ways to do it, such as Pretty Good Privacy.

There’s a far more powerful driver for this kind of work. Its name? Paranoia.

Ref: arxiv.org/abs/0811.4138: LACK – a VoIP Steganographic Method

How much force does it take to stab somebody to death?

Wednesday, November 26th, 2008

How much force does it take to stab somebody to death? Strangely enough, forensic scientists do not know.

A number of groups have attempted to measure the forces necessary to penetrate skin, but the results are difficult to apply to murder cases because of the sheer range of factors at work. The type and sharpness of the knife; the angle and speed at which it strikes; the strength of skin, which varies with the age of the victim and the area of the body involved: these are just a few of the parameters that need to be taken into account.

So when giving evidence, forensic scientists have to resort to relative assessments of force.

“A mild level of force would typically be associated with penetration of skin and soft tissue whereas moderate force would be required to penetrate cartilage or rib bone. Severe force, on the other hand, would be typical of a knife impacting dense bone such as spine and sustaining visible damage to the blade,” says Michael Gilchrist at University College Dublin and pals who are hoping to change this state of affairs.

They’ve developed a machine that measures the force required to penetrate skin–either the animal kind or an artificial human skin made of polyurethane, foam and soap.

The surprise they’ve found is that the same knives from the same manufacturer can differ wildly in sharpness. And the force required for these knives to penetrate the skin can differ by more than 100 per cent.

That could have a significant bearing in some murder cases. And that’s important because in many European countries such as the UK, stabbing is the most common form of homicide.

Gilchrist and co say their work could even help tease apart what has happened in that most common of defences: “he ran onto the knife, your honour”.

The key thing here is the speed and angle of penetration. The angle can be measured easily enough but the speed is another matter altogether. Gilchrist and co say future work may throw some light on this.

Ref: arxiv.org/abs/0811.3955: Mechanics of Stabbing: Biaxial Measurement of Knife Stab Penetration of Skin Simulant

The exoplanet photo gallery is bigger than you think

Monday, November 17th, 2008

Astronomers tend to get excited by pinpricks of light. And perhaps today they have more reason than usual to celebrate the pixels that Paul Kalas at the University of California, Berkeley, and pals have found in one of the Hubble Space Telescope’s images.

These pixels, they say, represent the first optical image of a planet orbiting another star. The star in question is Fomalhaut, in the southern constellation of Piscis Austrinus, and one of the brightest stars in the sky.

Kalas and co say the planet is about three times the mass of Jupiter, orbiting at a rather distant 119 AU. By comparison, Neptune orbits at around 30 AU, so this is going to be one cold body.
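For a sense of just how distant and slow that orbit is, Kepler’s third law gives the period. The stellar mass below is a rough assumed round value of twice the Sun’s, used only for illustration.

```python
import math

# Kepler's third law in convenient units:
# P[years]^2 = a[au]^3 / M[solar masses]
def orbital_period_years(a_au, star_mass_suns):
    return math.sqrt(a_au**3 / star_mass_suns)

# Fomalhaut taken as roughly twice the Sun's mass (assumed round value)
print(orbital_period_years(119, 2.0))   # roughly 900 years per orbit
print(orbital_period_years(30, 1.0))    # Neptune, for comparison: ~164 years
```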

That’s impressive work that has had significant press coverage but let’s put it in perspective.

Last year, the infrared Spitzer Space Telescope photographed HD 189733b, a Jupiter-sized gaseous planet orbiting a yellow dwarf in the constellation of Vulpecula. It even produced a heat map of the surface showing, unsurprisingly, that the planet is warmer at the equator than at the poles. But the map of HD 189733b got almost no coverage. And images of various “hot Jupiters” have been around for perhaps a decade or so.
I guess Hubble just has a better PR team.

Ref: arxiv.org/abs/0811.1994: Optical Images of an Exosolar Planet 25 Light Years from Earth

The terrible truth about economics

Friday, October 31st, 2008

“Compared to physics, it seems fair to say that the quantitative success of the economic sciences is disappointing,” begins Jean-Philippe Bouchaud, an econophysicist at Capital Fund Management in Paris. That’s something of an understatement given the current global financial crisis.

Economic sciences have a poor record of success, partly because they are hard (Newton once pointed out that modelling the madness of people is more difficult than the motion of planets). But also because economists have singularly failed to apply the basic process of science to their discipline.

By that I mean the careful collection and analysis of observable evidence which allows the development of hypotheses to explain how things work.

This is a process that has worked well for the physical sciences. Physicists go to great lengths to break hypotheses and replace them with better models.

Economics (and many other social sciences) works back to front. It is common to find economists collecting data to back up an hypothesis while ignoring data that contradicts it.

Bouchaud gives several examples.  The notion that a free market works with perfect efficiency is clearly untenable. He says: “Free markets are wild markets. It is foolish to believe that the market can impose its own self-discipline.” And yet economists do believe that.

The Black-Scholes model for pricing options assumes that price changes have a Gaussian distribution. In other words, the model, and the economists who developed it, assume that the probability of extreme events is negligible. We’re all now able to reconsider that assumption at our leisure.
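It’s easy to put a number on just how negligible. Under a Gaussian model, the chance of a single day’s price move beyond a given number of standard deviations can be computed directly:

```python
import math

def gaussian_tail(k):
    """P(X > k standard deviations) for a normally distributed variable."""
    return 0.5 * math.erfc(k / math.sqrt(2))

# Probability of a daily move beyond k sigma under the Gaussian assumption:
for k in (3, 5, 10):
    print(k, gaussian_tail(k))
# At 10 sigma the probability is around 8e-24: on the Gaussian account such
# a day should never occur in the lifetime of the universe, yet real,
# fat-tailed markets produce extreme days far more often than that.
```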

Bouchaud could also have added the example of economists’ assumption that sustained and unlimited economic growth is possible on a planet with limited resources.   It’s hard to imagine greater folly.

So what is to be done? Bouchaud suggests building better models with more realistic assumptions;  stronger regulation; proper testing of financial products under extreme conditions; and “a complete change in the mindset of people working in economics”.

All these things seem like good ideas.

But Bouchaud also seems blind to the greatest folly of all: implying that the roller coaster ride we have seen in recent weeks can somehow be avoided in these kinds of complex systems.

Various physicists have shown that stock markets demonstrate the same kind of self-organised criticality as avalanches, earthquakes, population figures, fashions, forest fires… The list is endless.

And of course, nobody expects to be able to prevent the spread of bell bottoms or earthquakes or avalanches. If you have forests, you’re going to have forest fires.

What people do expect, however, is to have mitigation procedures in place for when these disasters do happen. That’s where econophysicists need to focus next.

Ref: arxiv.org/abs/0810.5306: Economics Needs a Scientific Revolution

Frustration with fluid dynamics

Thursday, October 23rd, 2008

There is no shortage of fascinating videos for the Gallery of Fluid Motion at the upcoming meeting of the American Physical Society Fluid Dynamics division.

At least they sound interesting. We’ll never know, because they’re practically impossible to download from the eCommons library at Cornell University. That’s not good enough.

Surely YouTube (or one of its cousins) would be a better way to display these videos quickly, easily and above all reliably.

By all means place a hi-res version on the eCommons library for whoever has the patience to download it, but spare a thought for the rest of us. Spread the lurv, is all I’m saying. There’s plenty to go round.

Here’s a list of just a few of the interesting-sounding videos on the arXiv. If you’re thinking of downloading any, good luck.

Dynamics of Water Entry

The Bounce-Splash of a Viscoelastic Drop

Tornadoes in a Microchannel

Liquid acrobatics

The Clapping Book

Atoms in the Surf

Update: Turns out the videos are already on YouTube (see comments). Doh! (Might have been useful to mention this in the papers.)

Thanks Sheila!