Archive for the ‘Nets ‘n’ webs’ Category

VoIP threatened by steganographic attack

Friday, May 30th, 2008

VoIP steganography

Steganography is the art of hiding messages when they are sent, in a process akin to camouflage. In cryptography, on the other hand, no attempt is made to hide the message's existence, only to conceal its content.

Today, Wojciech Mazurczyk and Krzysztof Szczypiorski of the Warsaw University of Technology in Poland explain how VoIP services are wide open to steganographic attack and even measure how much information can be sent covertly in this way.

VoIP services such as Skype are vulnerable to steganographic attack because they use such high bandwidth, which makes it relatively easy to embed a hidden message in the bit stream in a way that is almost impossible to detect.

For precisely this reason, the US Department of Defense specifies that any covert channel with a bandwidth higher than 100 bps must be considered insecure for average security requirements. For high security requirements, the DoD says the data rate should not exceed 1 bps, making it next to impossible to embed a hidden code without it being noticed.

So VoIP systems such as Skype, with their much higher data rates, are difficult to secure.

And to prove it, Mazurczyk and Szczypiorski have tested a number of steganographic attacks (including two new ones they’ve developed themselves) on a VoIP system to determine how much data could be sent. They say that during an average call (that’s 13 minutes long according to Skype) they were able to covertly transmit as much as 1.3 Mbits of data.
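
As a back-of-the-envelope check (mine, not the paper's), those figures imply a covert channel running well above the DoD thresholds quoted above:

```python
# Rough arithmetic only: the 13-minute call length, the 1.3 Mbit figure and
# the DoD thresholds are taken from the text above; nothing here comes from
# the paper's own calculations.
average_call_seconds = 13 * 60        # average Skype call: 13 minutes
covert_bits = 1.3e6                   # ~1.3 Mbits hidden per call

covert_rate_bps = covert_bits / average_call_seconds
print(f"Implied covert data rate: {covert_rate_bps:.0f} bps")   # roughly 1.7 kbps

for label, threshold_bps in [("average-security", 100), ("high-security", 1)]:
    factor = covert_rate_bps / threshold_bps
    print(f"Exceeds the {label} threshold by a factor of ~{factor:.0f}")
```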

That should get a number of governments, companies and individuals thinking. How secure is your VoIP system?

Ref: arxiv.org/abs/0805.2938: Steganography of VoIP streams

World’s oldest social network reconstructed from medieval land records

Tuesday, May 13th, 2008

Medieval network

The network of links between peasants who farmed a small region of south-west France called Lot between 1260 and 1340 has been reconstructed by Nathalie Villa from the Universite de Perpignan in France and friends.

The team took their data from agricultural records that have been preserved from that time. This is a valuable dataset because it records the date, the type of transaction and the peasants involved.

Villa and co used this to recreate the network of links that existed between individuals and families in the 13th and 14th centuries in this part of France. They then drew up a self-organising map of the network (see above).
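
The paper's method, a kernel self-organising map, is beyond a quick sketch, but the first step of turning transaction records into a network is straightforward. Here is a minimal illustration in Python, using invented records rather than the real Lot archive:

```python
# A minimal sketch of the network-building step. The records below are
# invented for illustration; the real data comes from the Lot agricultural
# archives, and the kernel SOM used in the paper is not reproduced here.
import itertools
import networkx as nx

# Hypothetical records: (year, transaction type, peasants involved)
records = [
    (1275, "land sale",  ["Arnaud", "Bertrand"]),
    (1280, "land lease", ["Bertrand", "Geraud", "Pons"]),
    (1291, "land sale",  ["Arnaud", "Pons"]),
    (1302, "dowry",      ["Geraud", "Hugues"]),
]

G = nx.Graph()
for year, kind, people in records:
    # Link every pair of people named in the same transaction.
    for a, b in itertools.combinations(people, 2):
        if G.has_edge(a, b):
            G[a][b]["weight"] += 1
        else:
            G.add_edge(a, b, weight=1, first_seen=year, kind=kind)

print(f"{G.number_of_nodes()} people, {G.number_of_edges()} links")
print("Links per person:", dict(G.degree()))
print("Network density:", round(nx.density(G), 2))
```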

But the best is surely to come. What Villa hasn’t yet done is analyse the network’s properties. Does this medieval network differ in any important ways from the kind of networks we see between individuals in the 21st century? If so, what explains the differences; and if not, what are the invariants that link our world with 13th century France? The team promises an analysis in the near future.

In the meantime, it’s worth reflecting on the significance of this work. These kinds of networks could provide anthropologists with an exciting new way to study historical societies.

And while this may be the world’s oldest social network (if anyone knows of an older network, let us know), it’s unlikely to remain so for long. Excellent records survive of transactions in ancient Rome, from the earlier Greek empire and even from the Egyptian civilizations that built the pyramids some 4000 years ago.

If Villa’s work turns up any useful insights into the nature of medieval society in France, you can be sure that anthropologists will rush to repeat the method using data from even older societies.

All that’s left is to christen the new science of studying ancient social networks. Any suggestions?

Ref: arxiv.org/abs/0805.1374: Mining a Medieval Social Network by Kernel SOM and Related Methods

The mathematics of tackling tax evasion

Friday, May 9th, 2008

Tax evasion

In recent years, economists have gained the luxury of actually being able to test their ideas in experiments involving the behaviour of real people. And one particularly new and promising area of experimental economics focuses on tax evasion, which ought to be of keen interest to many governments around the world.

A couple of years ago, Simon Gachter at the University of Nottingham carried out a number of experiments on the way people co-operate which had profound implications for tax evasion. Gachter’s conclusion was that people decide whether or not to pay taxes based on the behaviour of their peers. The implication is that in certain circumstances, tax evasion may be a kind of fashion that spreads through society like bell-bottomed jeans.

Today, Georg Zaklan from the University of Bamberg in Bavaria, Germany, and pals show just how this might work in the real world by constructing a model of tax evasion behaviour in society.

His society is an Ising spin model (most commonly used to show critical behaviour in magnetic materials) in which agents can choose to evade taxes or not based on the behaviour of their neighbours.

Sure enough, the model shows that without any control on tax evasion, the behaviour can spread rapidly, disappear equally quickly and re-appear again later (just like bell-bottoms).

But the beauty of Zaklan’s simulation is that it suggests a way in which governments can very easily prevent the spread of tax evasion. The team modelled the effect of increasing the probability that a tax evader will be caught and showed that a small increase could have a profound effect on tax evasion.

So what governments should do is increase the number of tax audits they carry out (as well as making sure there are adequate punishments for offenders). Zaklan says the model implies that if just 1 per cent of the population were audited, tax evaders would be brought to heel for good.
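
For a flavour of how such a model works, here is a minimal Python sketch of an Ising-style lattice of taxpayers with an audit rule, in the spirit of the model described above. The lattice size, "social temperature" and penalty length are made-up illustrative values, not the paper's:

```python
# Sketch of an Ising-style tax evasion model with audits. All parameter
# values are invented for illustration; they are not the ones used in
# Zaklan's paper.
import math
import random

L = 30             # lattice side (hypothetical)
T = 2.5            # "social temperature" controlling imitation noise (hypothetical)
AUDIT_PROB = 0.01  # chance an evader is caught each sweep (the 1 per cent figure)
PENALTY = 10       # sweeps a caught evader is forced to comply (hypothetical)
SWEEPS = 200

random.seed(1)
spin = [[random.choice([1, -1]) for _ in range(L)] for _ in range(L)]  # +1 comply, -1 evade
locked = [[0] * L for _ in range(L)]    # sweeps of forced compliance remaining

for sweep in range(SWEEPS):
    for _ in range(L * L):
        i, j = random.randrange(L), random.randrange(L)
        if locked[i][j]:
            continue                    # audited agents cannot change their mind
        # sum over the four nearest neighbours (periodic boundaries)
        nb = (spin[(i + 1) % L][j] + spin[(i - 1) % L][j]
              + spin[i][(j + 1) % L] + spin[i][(j - 1) % L])
        # heat-bath update: copy the local majority, with noise set by T
        p_comply = 1.0 / (1.0 + math.exp(-2.0 * nb / T))
        spin[i][j] = 1 if random.random() < p_comply else -1
    # audits: a caught evader is forced into compliance for PENALTY sweeps
    for i in range(L):
        for j in range(L):
            if locked[i][j]:
                locked[i][j] -= 1
            elif spin[i][j] == -1 and random.random() < AUDIT_PROB:
                spin[i][j], locked[i][j] = 1, PENALTY
    if sweep % 40 == 0:
        evading = sum(row.count(-1) for row in spin) / (L * L)
        print(f"sweep {sweep:3d}: {evading:.1%} evading")
```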

That sounds interesting and might be worth a try in some countries, were it not for some important gaps in the paper.

The biggest of these is this: what evidence is there that tax evasion fluctuates in the real world in the way that the Ising model predicts? Zaklan doesn’t present any, so while this work is interesting, I’ll need some better evidence before I’m convinced that his model really describes what’s going on.

Ref: arxiv.org/abs/0805.0998: Controlling tax evasion fluctuations

How many politicians spoil the broth? More than 20…

Thursday, April 17th, 2008

Cabinet size

The Scottish author Robert Louis Stevenson once said: “politics is perhaps the only profession for which no preparation is thought necessary.”

Given that these people run the world’s biggest (and smallest) economies, how many are needed to do a decent job?

It is well known in management circles that decision making becomes difficult in groups of more than 20 or so. The British historian Northcote Parkinson studied this idea in relation to British politics and conjectured that a cabinet loses political grip as soon as its membership passes a critical size of 19-22 due to its inability to make efficient decisions.

Now Peter Klimek and pals from the Complex Systems Research Group at the Medical University of Vienna in Austria have found a similar relationship between the efficacy of political systems around the world and the size of the cabinets they employ to make decisions.

Using data supplied by the CIA (which must obviously be 100 per cent correct), they compared the cabinet sizes in 197 self-governing countries with various indicators of those countries’ economic, social and democratic performance: for example, the UN’s Human Development Index, which assesses a country’s achievement in areas such as GDP, life expectancy at birth and literacy.

The size of cabinets varied from just 5 in Liechtenstein and Monaco to 54 in Sri Lanka.

Klimek and co say that the various indicators of success are negatively correlated with cabinet size. Their message is, rather predictably, that too many cooks spoil the broth.
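
The statistical claim itself is simple enough: a correlation between cabinet size and each indicator. Here is a toy illustration with invented numbers (the real study uses the CIA cabinet data and indicators such as the UN's index):

```python
# Toy illustration of the correlation being described: cabinet size against
# a development indicator. The numbers below are invented; they are not the
# study's data.
from statistics import correlation  # available from Python 3.10

cabinet_size = [5, 8, 12, 15, 18, 21, 25, 30, 38, 54]   # hypothetical countries
indicator    = [0.95, 0.93, 0.90, 0.88, 0.85, 0.80,
                0.74, 0.70, 0.62, 0.55]                  # hypothetical HDI-like scores

r = correlation(cabinet_size, indicator)
print(f"Pearson correlation: {r:.2f}")  # negative: bigger cabinets, lower scores
```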

More interesting is their claim that there is a critical value of around 19-20 members, beyond which consensus is more difficult to achieve. They build a (somewhat unconvincing) mathematical model to show that at this critical value “dissensus” becomes more likely because it is easier to form multiple opposing factions in groups of more than 20 members. However, the transition from consensophile to dissensophile groups doesn’t look very critical to me.

All this is of more than passing relevance in Europe, where the recent expansion of the European Union has resulted in a club of 27 nations. How will effective decisions be made? By reducing the size of the cabinet, called the European Commission, to 18 members, with various countries rotating in and out.

That means a third of the members will not be represented at the executive level, which is praiseworthy for its practicality but dubious from a democratic point of view. But that’s politics.

Ref: arxiv.org/abs/0804.2202: To How Many Politicians should Government be Left?

Criticality and the brain

Tuesday, April 8th, 2008

Brain connections

Our understanding of how various parts of the brain function is advancing at breakneck speed, and yet we are as far away as ever from an overarching “theory of the brain” that encompasses these discoveries. Such a theory would unite disparate findings in brain science under a unifying theme.

Now Dante Chialvo from Northwestern University in Chicago and colleagues attempt to do just that. Their proposal is that the brain is spontaneously posed at the border of a second order phase transition, just like the transition a ferromagnetic material undergoes as it switches from a non-magnetic to a magnetic phase.

One of the features of these transitions is the existence of a critical point at which both phases exist simultaneously, so that the distinction between them more or less disappears. At this so-called “criticality”, all kinds of curious phenomena have been found, including self-organising behaviour.

Chialvo and buddies say “all human behaviors, including thoughts, undirected or goal oriented actions or any state of mind, are the outcome of a dynamical system at or near a critical state.”

They make a list of features that they would expect the brain to demonstrate in experiments were it operating close to criticality.

At large scales, they say, we should see cortical long range correlations in space and time as well as large scale anti-correlated cortical states. That certainly seems to be true of our brains in general.

And at small scales, we should see “neuronal avalanches” as the normal homeostatic state for most neocortical circuits. Sure enough, the group points to evidence for this.
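
Neuronal avalanches are usually framed in terms of a critical branching process, in which each active unit triggers on average exactly one other. The sketch below illustrates that generic idea; it is a standard toy model, not Chialvo's analysis:

```python
# A standard toy model of neuronal avalanches: a critical branching process,
# in which each active unit triggers 0, 1 or 2 others with equal probability,
# so the mean number of offspring (the branching ratio) is exactly 1.
# This is a generic illustration, not the analysis in the paper.
import random
from collections import Counter

def avalanche_size(max_size=100_000):
    """Size of one avalanche started by a single active unit."""
    active, size = 1, 0
    while active and size < max_size:
        size += active
        active = sum(random.choice((0, 1, 2)) for _ in range(active))
    return size

random.seed(0)
sizes = [avalanche_size() for _ in range(20_000)]
counts = Counter(sizes)

# At criticality the size distribution is heavy-tailed (roughly s^-3/2),
# so avalanches of all scales occur: the signature looked for in
# neuronal avalanche experiments.
for s in (1, 2, 4, 8, 16, 32, 64):
    print(f"P(size = {s:2d}) = {counts[s] / len(sizes):.4f}")
```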

The trouble is that these look very much like after-the-fact predictions, a feeling that is backed up by the absence of any testable hypothesis about the brain in this paper.

If the brain is close to criticality (which doesn’t seem like too far-fetched an idea), surely it would be possible to make some predictions about the results of experiments such as those involving human attention, optical illusions and the reaction to various stimuli.

So while Chialvo’s proposal may make the pretense of being a theory of the brain, to my mind it will have to settle for the status of “interesting idea” until somebody takes it significantly further.

Ref: arxiv.org/abs/0804.0032: The Brain: What is Critical about It?

The coming blackout

Wednesday, April 2nd, 2008

On Monday, 17th December 2007, Europe narrowly avoided disaster. A cold snap had lowered the temperature across much of the continent to several degrees below average and that evening, as households switched on their heating systems, power consumption hit critical levels.

France, Italy and Spain all set new records for power consumption. By sheer luck, Switzerland and Germany, which were less cold, were able to provide some 1.6 GWe of spare capacity to cover the cracks in the system.

As it turned out, the rest of the winter was abnormally mild. But had the cold snap been more widespread, the European electricity supply could have collapsed.

The problem dates from about 30 years ago when Europe’s grid system and generating capacity was built with a huge amount of spare capacity. Since then, as economies have boomed, politicians have had little incentive to upgrade the system. In the meantime, consumption has been increasing at the rate of 1-2 per cent per year and today the spare capacity has all but gone. With the simplest extrapolation being that demand will continue to grow at the same rate, a crisis looms.
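
The extrapolation is simple compounding. A quick illustration of what 1-2 per cent annual growth does to demand over a few decades (only the growth rates come from the text above):

```python
# The extrapolation above is just compound growth. Only the 1-2 per cent
# annual growth rate comes from the post; the rest is arithmetic.
for growth in (0.01, 0.02):
    for years in (10, 20, 30):
        factor = (1 + growth) ** years
        print(f"{growth:.0%} per year for {years} years -> demand x {factor:.2f}")
```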

Now the Union for the Co-ordination of Transmission of Electricity, an association of power providers in Europe, has issued a report detailing the system’s shortcomings. And an analysis on the arXiv by Michael Dittmar at the Swiss Federal Institute of Technology in Zurich paints an even gloomier picture, not least because there is no clear short-term path to reducing consumption or increasing generating capacity.

Europe has suffered a number of large blackouts in recent years, notably in Italy on 28-29 September 2003 and in France and Germany on 4 November 2006. But worse looks to be on the cards. Dittmar’s message is that the coming winter of 2008/9 will test the European grid to its limits.

Ref: arxiv.org/abs/0803.4421: The European Electricity Grid System and Winter Peak Load Stress

Proof that a minority of streets handle the majority of traffic

Wednesday, March 12th, 2008

Gavle

In recent years, physicists have turned their penetrating gaze towards the structure of towns and cities. What they tend to do is measure the “connectedness” of a town by looking at how many roads each street is connected to. It turns out that cities follow an 80/20 rule: 80 per cent of streets have below-average connectedness while 20 per cent have above-average connectedness.

This is no surprise since the same kind of 80/20 pattern crops up with alarming regularity in all kinds of networks, particularly social ones. (The most famous is Pareto’s law which states that 80 per cent of the wealth is owned by 20 per cent of the people).

But so what? Poring over maps and sweating over street names may be a theoretical physicist’s idea of fun, but nobody has actually proved that the 80/20 rule has any tangible effect on street use.

Now Bin Jiang at the Hong Kong Polytechnic University has come up with some actual data from a real town. He says that 80 per cent of the traffic in the Swedish town of Gavle flows along 20 per cent of the streets. And the 1 per cent of most highly connected streets accounts for a phenomenal 20 per cent of the flow. What’s more, he says the flow is intimately linked to the topology of Gavle (a town of 70,000 people).
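
For anyone curious about the bookkeeping behind such a claim, here is a rough sketch using a generated toy network in place of Gavle's real street data: count how many streets fall below the average connectivity and how much of the total connectivity the best-connected fifth holds.

```python
# Rough sketch of the bookkeeping behind an 80/20 claim, using a generated
# scale-free toy network in place of Gavle's real street network. Each node
# stands for a named street; an edge means two streets intersect.
import networkx as nx

G = nx.barabasi_albert_graph(1000, 2, seed=42)   # heavy-tailed toy network
degrees = [d for _, d in G.degree()]
mean_degree = sum(degrees) / len(degrees)

below_average = sum(1 for d in degrees if d < mean_degree) / len(degrees)
print(f"Streets with below-average connectivity: {below_average:.0%}")

# Share of total connectivity held by the best-connected 20 per cent:
# a crude proxy for the traffic they would carry if flow tracked connectivity.
top_fifth = sorted(degrees, reverse=True)[: len(degrees) // 5]
print(f"Connectivity share of the top 20%: {sum(top_fifth) / sum(degrees):.0%}")
```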

So there you have it. Although it seems only common sense to imagine that the most traffic flows along the best connected streets, we now have some evidence to prove it. Good, solid, unspectacular physics.

Ref: arxiv.org/abs/0802.1284: Street Hierarchies: A Minority of Streets Account for a Majority of Traffic Flow

Can data overload protect our privacy?

Monday, March 10th, 2008

Messenger

If you were chatting on MSN messenger in June 2006, your conversation was being recorded and the details (but not the content) passed to Eric Horvitz and Jure Leskovec at Microsoft Research in Redmond, Washington. Using this data, these scientists have created “the largest social network constructed and analyzed to date”.

They’ve now published their results which show the habits of people who use Messenger and the scale on which it occurs. But this study is noteworthy for another reason: it gives a curious insight into the limitations of this kind of analysis. The Microsoft team says it had too much data and this affected its ability to crunch it effectively.

Here’s what they did. The researchers used data such as IP addresses and log-in and log-out times, as well as self-reported information such as age, sex, and zip code (which are obviously highly accurate), to carry out their analysis.

The bald details are that 30 billion IM conversations took place between 180 million people all over the world in June 2006.

The researchers found that people tend to chat to individuals who share the same language, age group and geographical location (in other words, to people like themselves). They also chat more often and for longer with members of the opposite sex.

Each account had on average 50 buddies and, in the IM world, people are separated by “7 degrees of separation”.

That’s about the strength of it and I’m underwhelmed. No fascinating insights into the correlation between chatting spikes and news broadcasts/ad breaks/episodes of Friends; or the patterns of chat in the workplace versus home using IP location changes; or how IM users travel the world. Just straightforward count ’em ‘n’ weep numbers.

But there’s a good reason for the lack of more detailed insight. The problem, say Horvitz and Leskovec, is the size of the database: 4.5 terabytes, which took 12 hours to copy to a dedicated eight-processor server. “The sheer size of the data limits the kinds of analyses one can perform,” they say.

So will data overload always protect us from Big Brother’s prying eyes? Perhaps in some circumstances like these but otherwise I wouldn’t count on it. It’s straightforward to sample big datasets like this (although that can introduce problems of its own).
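
Sampling is, in fact, how figures like the "7 degrees of separation" above are usually obtained on graphs this big: run a breadth-first search from a random handful of accounts rather than from all 180 million. A sketch of the idea on a small stand-in graph:

```python
# Sketch of how a "degrees of separation" figure is usually estimated on a
# graph too big for an all-pairs computation: run breadth-first search from
# a sample of nodes and average the path lengths. The small-world graph
# below is a stand-in for the Messenger network, which is far larger.
import random
import networkx as nx

random.seed(7)
G = nx.watts_strogatz_graph(n=10_000, k=10, p=0.1, seed=7)  # toy small world

path_lengths = []
for source in random.sample(list(G.nodes()), 20):               # sampled sources
    lengths = nx.single_source_shortest_path_length(G, source)  # one BFS
    path_lengths.extend(lengths.values())

average = sum(path_lengths) / len(path_lengths)
print(f"Estimated average separation: {average:.1f} hops")
```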

I wouldn’t mind betting that with a little more effort, it would be possible to identify individuals from their travel and chatting patterns, perhaps by correlating the data with local telephone and business directories much in the same way this has been done with search data. However, it looks as if Horvitz and Leskovec have steered carefully around this issue.

Of course, Microsoft doesn’t need to do this since it can store a much fuller set of data anyway including the full text of the conversations and whatever data it has on the identity of the owners.

And you can be sure that more shadowy organisations with access to much greater computing resources will also have this full data set and be happily chewing through it as you read this.

Ref: arxiv.org/abs/0803.0939: Planetary-Scale Views on an Instant-Messaging Network

Food for thought

Tuesday, March 4th, 2008

Food for thought

Evolution seems to crop up all over the place. In life, business, ideas. And now in recipes through the ages.

Yup, that’s recipes. For food. Osame Kinouchi from the Universidade de São Paulo in Brazil and buddies have studied the way in which the ingredients used in recipes vary around the world and through the ages. And they’ve found, they say, evidence of evolution.

The team studied the relationship between recipes and ingredients in four cookbooks: three editions of the Brazilian Dona Benta (1946, 1969 and 2004), the French Larousse Gastronomique, the British New Penguin Cookery Book, and the medieval Pleyn Delit.

They took the recipes from each book, counted the number of times each ingredient appeared in these recipes and ranked them according to frequency.
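
In miniature, that counting step looks something like this (the handful of recipes below is invented; the paper does this for whole cookbooks):

```python
# The counting step in miniature: tally how often each ingredient appears
# across a set of recipes and rank by frequency. The recipes below are
# invented for illustration.
from collections import Counter

recipes = [
    ["flour", "butter", "sugar", "egg"],
    ["onion", "garlic", "tomato", "olive oil", "salt"],
    ["flour", "egg", "milk", "salt"],
    ["rice", "onion", "garlic", "salt", "butter"],
]

counts = Counter(ingredient for recipe in recipes for ingredient in recipe)

# Plotting frequency against rank for a real cookbook gives the
# frequency-rank curve the authors analyse.
for rank, (ingredient, freq) in enumerate(counts.most_common(), start=1):
    print(f"rank {rank:2d}: {ingredient:<10s} appears {freq} time(s)")
```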

What’s remarkable is that the frequency-rank distribution they found is more or less the same for each cookbook. Kinouchi and co say this can be explained if recipes evolve in much the way that living organisms do–in a landscape in which some ingredients can be thought of as fitter than others, in which random mutations take place, and some ingredients die out while others prosper.

Very clever…unless they’ve missed something.

Perhaps it’s not ingredients that produce this distribution but words themselves. I’d be interested to see whether the results would be significantly different were they to examine the frequency of adjectives or colours or numbers in these books rather than ingredients. If not, then recipes have nothing to do with the results they are presenting.

Of course, it’s possible that recipes have evolved in the way the group suggests. But the evidence they present here doesn’t look convincing to me.

Ref: arxiv.org/abs/0802.4393: The Nonequilibrium Nature of Culinary Evolution

Why silos burst

Thursday, January 31st, 2008

Force chain

Believe it or not, grain silos are interesting structures. They’ve been known to explode without warning, which is hard to explain since they are filled with, well, grain.

But grain turns out to be kinda interesting too. In recent years, researchers have begun to get a handle on some of the strange and counterintuitive ways in which grain behaves as it flows and as it is placed under pressure.

One of the most interesting developments has been the discovery of “force chains”, networks of particles that form as force is passed from one grain to the next (see picture). In this way, forces many orders of magnitude greater than expected can be transmitted through the medium.
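
Force chains are often illustrated with the "q-model" of Coppersmith and colleagues, in which each grain splits its load randomly between the grains beneath it. The sketch below is that generic model, not Wambaugh's silo simulation, but it shows how very uneven force networks arise even from uniform loading:

```python
# A generic illustration of force chains: the "q-model" of Coppersmith and
# colleagues, in which each grain passes random fractions of its load (plus
# its own weight) to two grains below. Even with uniform loading at the top,
# the forces deep in the packing become very uneven. This is not the
# two-dimensional silo simulation described in the post.
import random

random.seed(3)
WIDTH, DEPTH = 50, 200
forces = [1.0] * WIDTH                  # uniform load on the top layer

for _ in range(DEPTH):
    below = [0.0] * WIDTH
    for i, f in enumerate(forces):
        q = random.random()             # random split of this grain's load
        below[(i - 1) % WIDTH] += q * (f + 1.0)         # +1.0 is the grain's own weight
        below[(i + 1) % WIDTH] += (1 - q) * (f + 1.0)
    forces = below

mean = sum(forces) / WIDTH
print(f"mean force at depth {DEPTH}:  {mean:.0f}")
print(f"largest force / mean force: {max(forces) / mean:.1f}")
```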

John Wambaugh and colleagues at Duke University in Durham have been studying the force networks that are set up within a two-dimensional silo and how these can make the forces behave in an extraordinary, non-linear way.

When grain is added to the top of the silo, the pressure in the medium increases, but it goes on increasing in a non-linear way even after the addition of material has stopped, before eventually decaying: a so-called “giant overshoot” effect.

How to explain this? Usually, force chains break and reform as the pressure changes in a granular medium and this helps to spread the forces evenly within it.

But Wambaugh thinks the non-linear behaviour suggests that something else is going on. He says that in certain circumstances the force chains become locked in place, so that the additional pressure spreads much further and deeper than usual, creating the giant overshoot.

It might also explain why silos sometimes burst unexpectedly.

Ref: arxiv.org/abs/0801.3387: Force Networks and Elasticity in Granular Silos