Archive for the ‘The good ol’ days’ Category

10 years of the Physics arXiv Blog: 2007

Friday, August 11th, 2017

The Physics arXiv Blog is 10 years old today. Over the next few days, we’ll be celebrating by publishing links to the top stories from each year of its existence.

Today, 2007. Enjoy!

The incredible galactic foxtrot

Invasion of the jivin’ nanoshrooms

Breaking the Netflix prize dataset

Why Galileo underestimated the distance to the stars

Thursday, August 28th, 2008

“Galileo argued that with a good telescope one could measure the angular sizes of stars, and that the stars typically measured a few arc-seconds in diameter,” says Chris Graney at Jefferson Community College in Louisville in good ol’ Kentucky.

That doesn’t sound right.  We know today that stars appear as point sources of light, so what was Galileo talking about?

Graney says that it looks increasingly likely that Galileo had developed an ingenious technique for measuring the angular size of distant objects. This allowed him, for example, to see that Jupiter’s apparent diameter became smaller as the distance between Earth and Jupiter increased.

So how did Galileo measure the angular size of stars?

The answer, according to Graney, is that Galileo must have been unknowingly measuring the diffraction pattern his telescope created from the starlight, the so-called Airy disc that makes stars seem to have an apparent diameter.

This is how he arrived at estimates that placed the stars absurdly close. For example, he thought that the brightest stars were about 360 astronomical units away, orders of magnitude nearer than we know them to be today.
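
For a feel for the numbers, here is a minimal back-of-the-envelope sketch (mine, not Graney’s; the 2 cm aperture and the 5 arc-second apparent diameter are illustrative guesses) showing how diffraction mimics an angular size, and how that size turns into a distance if you assume the star is a sun-like body.

    ARCSEC_PER_RAD = 206265.0

    # Angular radius of the first dark ring of the Airy pattern: theta = 1.22 * wavelength / aperture.
    wavelength = 550e-9            # visible light, in metres
    aperture = 0.02                # assumed aperture of a Galileo-era telescope, in metres
    theta = 1.22 * wavelength / aperture
    print("Airy radius: %.1f arc-seconds" % (theta * ARCSEC_PER_RAD))      # roughly 7 arc-seconds

    # If a star shows an apparent diameter of ~5 arc-seconds (an assumed value) and is taken to be
    # the same physical size as the Sun (about 1919 arc-seconds across at 1 AU), then
    # distance = solar angular diameter / stellar apparent diameter, in astronomical units.
    sun_diameter_arcsec = 1919.0
    star_apparent_arcsec = 5.0
    print("Implied distance: %.0f AU" % (sun_diameter_arcsec / star_apparent_arcsec))  # close to Galileo's ~360 AU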

Makes sense, I suppose. And since the wave theory of light needed to understand diffraction wasn’t developed until some two centuries after Galileo’s death, perhaps we should forgive him this one mistake.

Ref: arxiv.org/abs/0808.3411: Objects in the Telescope are Further Than They Appear

Deconstructing DiMaggio’s 56-game hitting streak

Monday, August 4th, 2008

“The incredible record of Joe DiMaggio in the summer of 1941 is unparalleled. No one has come close—before or since—to equaling his streak of hitting safely in 56 games in a row.”

So begin Steve Strogatz and Sam Arbesman from Cornell University in their paper discussing the likelihood of DiMaggio’s record.

“People have…stated that it is the only record in baseball (or perhaps even in all of sports) that never should have happened, statistically speaking: while other records can be explained by expected outliers over the long and varied history of professional baseball (nearly 150 years), DiMaggio’s record stands alone.”

But as with so many statistical assumptions, a proper analysis can reveal counterintuitive results, say Strogatz and Arbesman. The pair have modelled the phenomenon of hitting streaks using a number of simple models and guess what…DiMaggio’s record is not as unexpected as it looks.

The models suggest that while a DiMaggio-like record is unlikely in any given year, it is not unlikely to have occurred about once within the history of baseball.
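
For a flavour of the kind of Monte Carlo calculation involved, here is a bare-bones sketch. It is my own toy, not the authors’ models, which are fed real player statistics: assume a hitter gets at least one hit in any given game with a fixed probability, simulate a large number of seasons, and count how often a 56-game streak turns up.

    import random

    def longest_streak(p_hit_game, n_games, rng):
        """Longest run of consecutive games with at least one hit in a simulated season."""
        best = current = 0
        for _ in range(n_games):
            if rng.random() < p_hit_game:
                current += 1
                best = max(best, current)
            else:
                current = 0
        return best

    # Illustrative numbers: a .350 hitter with about 4 at-bats per game gets at least
    # one hit in a game with probability 1 - (1 - 0.350)**4, roughly 0.82.
    p_hit_game = 1 - (1 - 0.350) ** 4
    n_games, n_seasons = 154, 50000
    rng = random.Random(42)

    streaks = sum(longest_streak(p_hit_game, n_games, rng) >= 56 for _ in range(n_seasons))
    print("Chance of a 56-game streak in one season: %.1e" % (streaks / n_seasons))
    # Tiny for a single season, but multiplied over thousands of player-seasons it is no longer outlandish.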

But when the statistical performance of a number of players is plugged into the models, DiMaggio turns out not to be the most likely to have picked up such a record. That honour goes to one of Ross Barnes, Willie Keeler or Hugh Duffy (there is no single most likely player across the models). DiMaggio, it turns out, is only the 47th most likely player to have reached the record in one of the models used.

More curious is why Strogatz, widely considered to be one of the fathers of small-world network theory, has taken up the baton in examining baseball statistics. He joins a small but select group of physicists and mathematicians with a passion for the game, including Gene Stanley and Persi Diaconis.

So what’s next? Surely the task now is to find a record that defies statistics in the sense that it is truly unlikely. Let me be the first to suggest Don Bradman’s 99.94 batting average in test cricket.

Ref: arxiv.org/abs/0807.5082: A Monte Carlo Approach to Joe DiMaggio and Streaks in Baseball

World’s oldest social network reconstructed from medieval land records

Tuesday, May 13th, 2008

Medieval network

The network of links between peasants who farmed a small region of southwest France called Lot between 1260 and 1340 has been reconstructed by Nathalie Villa from the Universite de Perpignan in France et amis.

The team took their data from agricultural records that have been preserved from that time. This is a valuable dataset because it records the date, the type of transaction and the peasants involved.

Villa and co used this to recreate the network of links that existed between individuals and families in the 13th and 14th centuries in this part of France. They then drew up a self-organising map of the network (see above).
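
For anyone curious how such a network gets built, the recipe is simple enough. Here is a minimal sketch using a made-up record format; the real corpus, and the kernel self-organising map step, are considerably more involved.

    import networkx as nx

    # Hypothetical transaction records: (year, type of transaction, peasants involved).
    records = [
        (1263, "land rent", ["Arnaud", "Bertrand"]),
        (1271, "land sale", ["Bertrand", "Guillaume", "Pons"]),
        (1302, "land rent", ["Arnaud", "Pons"]),
    ]

    G = nx.Graph()
    for year, kind, people in records:
        # Link every pair of peasants named in the same transaction.
        for i, a in enumerate(people):
            for b in people[i + 1:]:
                if G.has_edge(a, b):
                    G[a][b]["weight"] += 1
                else:
                    G.add_edge(a, b, weight=1, first_seen=year, kind=kind)

    print(G.number_of_nodes(), "peasants,", G.number_of_edges(), "links")
    print("Average clustering coefficient:", nx.average_clustering(G))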

But the best is surely to come. What Villa hasn’t yet done is analyse the network’s properties. Does this medieval network differ in any important ways from the kind of networks we see between individuals in the 21st century? If so, what explains the differences? And if not, what are the invariants that link our world with 13th-century France? The team promises an analysis in the near future.

In the meantime, it’s worth reflecting on the significance of this work. These kinds of networks could provide anthropologists with an exciting new way to study historical societies.

And while this may be the world’s oldest social network (if anyone knows of an older network, let us know), it’s unlikely to remain so for long. Excellent records survive of transactions from ancient Rome, from the earlier Greek world and even from the Egyptian civilization that built the pyramids more than 4000 years ago.

If Villa’s work turns up any useful insights into the nature of medieval society in France, you can be sure that anthropologists will rush to repeat the method using data from even older societies.

All that’s left is to christen the new science of studying ancient social networks. Any suggestions?

Ref: arxiv.org/abs/0805.1374: Mining a Medieval Social Network by Kernel SOM and Related Methods

Statistical evidence of drug abuse in baseball

Thursday, April 3rd, 2008

Baseball stats

How many major league baseball players have used performance-enhancing drugs? The answer turns out to be buried in the performance statistics of players, if you know where to look.

Eugene Stanley and colleagues at Boston University have done the appropriate number crunching and say that a whopping 5 per cent of players must have been users, and that’s just a lower limit. The evidence comes from an analysis of home runs hit by players in the last 25 years.
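
The flavour of that number crunching is a comparison of home-run statistics across eras. Here is a loose illustration only, with invented placeholder numbers rather than the paper’s actual data or method: a simple two-sample test asking whether the distribution of home-run rates has shifted.

    from scipy.stats import ks_2samp

    # Hypothetical home runs per at-bat for a handful of players in two eras (placeholder values).
    era_older = [0.021, 0.034, 0.028, 0.040, 0.019, 0.031, 0.025]
    era_recent = [0.030, 0.052, 0.047, 0.068, 0.029, 0.058, 0.041]

    # A Kolmogorov-Smirnov test asks whether the two samples could plausibly
    # come from the same underlying distribution.
    stat, p_value = ks_2samp(era_older, era_recent)
    print("KS statistic = %.2f, p = %.3f" % (stat, p_value))
    # A small p-value flags a genuine shift in the distribution, though not, by itself, its cause.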

Stanley’s analysis throws up some interesting takes on baseball. For example, it’s easy to imagine that pitchers and batters would both benefit from the increased strength and rapid recovery from injury that performance-enhancing drugs allow. Not so, says Stanley:

“We see evidence of competitive advantage mainly in the case of home runs. This indicates that the level of competition between pitcher and batter are tipping in the favor of the batter, possibly as a result of widespread performance-enhancing drug use.”

The statistics seem to back up the findings of former senator George Mitchell, who published the results of a two-year investigation into the widespread use of drugs among professional major league baseball players in December last year. The report listed 89 players who were alleged to have used steroids or other drugs.

Stanley ends by pointing out that drug use is by no means confined to baseball: “Performance-enhancing drugs are the core of a pandemic that not only poses personal health risk, but also places the integrity of sports in jeopardy.”

Ref: arxiv.org/abs/0804.0061: Statistical Evidence Consistent with Performance-enhancing Drugs in Professional Baseball

The curious case of the disappearing physicist

Thursday, March 27th, 2008

If you work in particle physics, cosmology or condensed matter, you’ll probably be familiar with the name Majorana, as in Majorana fermions and Majorana neutrinos.

But Ettore Majorana is famous for another reason. As one of the leading lights of theoretical physics in the 1930s, he made important contributions to nuclear, atomic and molecular physics as well as quantum electrodynamics and relativity. But on 26 March 1938, his career was cut short when he disappeared in mysterious circumstances in Naples and was never heard from again.

“On Friday March 25, 1938 Majorana went to the Institute of Physics and handed over the lecture notes and some other papers to one of his students. After that, he returned to his hotel and, after writing farewell letters to his family and to the director of the Institute of Physics, Carrelli, apparently embarked on a ship to Palermo. He reached his destination the following morning, where he lodged for a short time in the Grand Hotel Sole. It was there that he wrote a telegram and a letter to Carrelli pointing out a change of mind about his decisions. On Saturday evening Majorana embarked on a ship from Palermo to Naples. From here onwards, no other reliable information about him are available.”

So writes Salvatore Esposito from the University of Naples in Italy in a retrospective of Majorana’s life and work, posted on the arXiv on the 70th anniversary of his disappearance.

Nobody knows what happened to Majorana, but there is no shortage of theories, including a retreat to a monastery, flight to a foreign country and, most likely, suicide, for which there is some circumstantial evidence. We will almost certainly never know for sure.

Good topic for a Hollywood movie, I’d say.

Ref: arxiv.org/abs/0803.3602: Ettore Majorana and his Heritage Seventy Years Later

How cleanliness can kill

Thursday, November 1st, 2007

The hygiene hypothesis is that our immune system requires the presence of pathogens to grow and function properly. The thinkin’ is that dirt ’n’ muck provides a kinda training ground on which the immune system “learns” its trade when we’re all youngsters.

So mothers who keep a-scrubbin and a-cleanin them germs away are actually doin’ more harm than good. Their littluns’ immune systems ain’t never gonna learn how to fight off invaders.

Alotta medical bods think the hygiene hypothesis makes sense. They say it’s cleanliness that causes asthma and other allergies, not dirt. And the evidence is growing to back ’em up. But exactly how this balance between pathogens and our immune system works ain’t known.

Now Didier “See” Sornette at the Swiss Federal Institute of Technology in Zurich and a few buddies have built a mathematical model of the immune system based on this idea. Their assumption is that a balance exists between the immune system and the many pathogens that it comes across each day. (FYI our bodies are built outta 10^13 cells but house something like 10^14 bacteria.)

See Saw and his pals study two ways that this balance can change. The first is an external attack of pathogens such as a cholera epidemic or an infection following major surgery. Obviously that don’t do nobody no good.

But another type of change occurs when the immune system itself becomes weakened, perhaps by stress, lack of sleep or heavy boozin’. Then pathogens can spread even if the body ain’t exposed to an abnormal load.

See Saw’s work consists of exploring the topology of this mathematical model and findin’ areas of stability. The model predicts, for example, that a critically ill person can be made healthy by strengthening their immune system. Nothin’ strange about that. But it also predicts that ya can kill a critically ill person by reducing the load on their immune system. That’s when cleanliness kills.
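
For the flavour of it, here is a toy version of the idea. It is emphatically not Sornette’s model, just a generic two-variable sketch in which immune strength is trained by everyday pathogen exposure, so a body kept in a near-sterile environment meets a sudden infection with a weaker defence.

    from scipy.integrate import solve_ivp

    def rhs(t, y, background):
        """P = pathogen load, E = immune strength. Toy dynamics with illustrative parameters."""
        P, E = y
        dP = P * (1.0 - P / 10.0) - E * P        # pathogen growth, capped, minus immune clearance
        dE = (P + background) - 0.5 * E          # immune system trained by exposure, decays otherwise
        return [dP, dE]

    def peak_infection(background, dose=0.1):
        # Let the immune system settle under a given level of everyday pathogen exposure...
        E0 = 2.0 * background                    # steady state of dE/dt = 0 when P = 0
        # ...then hit the body with a sudden infection and see how large it grows.
        sol = solve_ivp(rhs, (0.0, 60.0), [dose, E0], args=(background,), max_step=0.1)
        return sol.y[0].max()

    print("Peak pathogen load, dirty childhood  :", round(peak_infection(background=0.75), 2))
    print("Peak pathogen load, sterile childhood:", round(peak_infection(background=0.25), 2))
    # Same infection, but the under-trained immune system lets it bloom far higher.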

So dirt might be even better than we thought. Not only does it train the immune system, but it can keep ya alive too. And if that’s the case, sterile hospitals could be as bad as dirty ones.

That’s gonna get people goin’ like a mongoose in a jockstrap. Ya’ll sit back and wait for the wailin’ and gnashin’ of teeth.

Ref: arxiv.org/abs/0710.3859: Endogenous versus Exogenous Origins of Diseases

The mystery of the missing photons

Friday, October 5th, 2007

A few hundred thousand years after the big bang, the Universe was a-hummin’ and a-jigglin’ with a plasma of hydrogen and helium nuclei as well as electrons. As the universe cooled, the electrons combined with the nuclei to form neutral atoms, giving off photons in the process. These photons are what we see as the cosmic background radiation.

Hundreds of millions of years later, ultraviolet light from the first stars and quasars stripped those electrons off again. This so-called cosmic re-ionization was significant because it allowed light from the first quasars and galaxies to travel through the universe for the first time. (Prior to that, the light was absorbed by the neutral gas.) And in 2001, astronomers at Caltech spotted relics from this period in the universe for the first time.

But Nickolay “Gnu” Gnedin at the Kavli Institute for Cosmological Physics at the University of Chicago says he’s found a problem. There ought to be enough ionizing photons comin’ from those early galaxies and quasars to account for all them atoms that needed re-ionizing, right? Well, Gnu has been a-countin’ and says there ain’t enough. He bases his argument on observational data and numerical simulations.
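
The counting is simple enough in principle. Here is a rough version of the minimum photon budget, using standard cosmological numbers rather than Gnedin’s actual calculation.

    # Minimum ionizing-photon budget: at least one photon per hydrogen atom,
    # plus a few more per atom to make up for recombinations along the way.
    omega_b_h2 = 0.022            # baryon density parameter times h^2
    rho_crit_h2 = 1.878e-29       # critical density divided by h^2, in g per cm^3
    m_H = 1.67e-24                # mass of a hydrogen atom, in g
    X_H = 0.75                    # hydrogen mass fraction

    n_H = X_H * omega_b_h2 * rho_crit_h2 / m_H         # hydrogen atoms per comoving cm^3
    cm3_per_Mpc3 = (3.086e24) ** 3                     # cubic centimetres in a cubic megaparsec

    recombinations_per_atom = 2                        # assumed, order unity
    photons_needed = n_H * cm3_per_Mpc3 * (1 + recombinations_per_atom)

    print("Hydrogen atoms per comoving Mpc^3:  %.1e" % (n_H * cm3_per_Mpc3))     # about 5e66
    print("Ionizing photons needed per Mpc^3:  %.1e" % photons_needed)
    # The question is whether the galaxies and quasars observed at z ~ 6 can supply that many.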

The work raises an interestin’ possibility. If we can’t see enough photons from all the galaxies, quasars and dwarf galaxies we been lookin’ at, then there’s gotta be other things out there which we ain’t seein’. Either that or Gnu Gnedi has his numbers wrong.

Ref: arxiv.org/abs/0709.3308: Are There Enough Ionizing Photons to Reionize the Universe by z=6?

Why the first stars could have been filaments

Saturday, September 29th, 2007

Star formation in the early universe ain’t particularly well understood. Why should a pristine cloud of gas suddenly collapse to form a star?

One theory is that clumps of dark matter create gravitational wells into which the gas clouds collapse (although why dark matter should be clumpy and not smooth is anybody’s guess).

But Liang “Ang Li” Gao at Durham University in the UK says that in some theories dark matter might equally be drawn into filaments. In which case the first stars would have formed as filaments rather than spheres. So they’d look like giant lightbulbs hanging in the cocktail lounge of space.

Ref: arxiv.org/abs/0709.2165: Lighting the Universe with Filaments

The missing language link

Monday, September 10th, 2007

The distribution of languages is the result of the movin’ and migratin’ of millions of people over tens of thousands of years. As a fossil of human history, it’s unrivalled in its richness.

So understandin’ this distribution is a major task for them linguists and them historians who want to know more about our ancestors’ locomotin’ habits. One interestin’ feature is that the size of language families follows a power law, like many other natural and social phenomena, whereas the distribution of language sizes by number of speakers looks quite different.
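
To see what the power-law claim amounts to in practice, here is a small sketch with synthetic numbers standing in for real family-size counts: fit the tail exponent with a maximum-likelihood (Hill) estimator and check the rank-size plot on log-log axes.

    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic stand-in for language-family sizes: drawn from a power law p(s) ~ s**(-alpha).
    alpha_true, x_min = 2.5, 1.0
    sizes = x_min * (1.0 + rng.pareto(alpha_true - 1.0, size=500))

    # Maximum-likelihood (Hill) estimate of the exponent.
    alpha_hat = 1.0 + len(sizes) / np.sum(np.log(sizes / x_min))
    print("Recovered exponent: %.2f (true value %.1f)" % (alpha_hat, alpha_true))

    # On log-log axes a power law gives a straight rank-size line with slope -(alpha - 1).
    ranks = np.arange(1, len(sizes) + 1)
    slope = np.polyfit(np.log(np.sort(sizes)[::-1]), np.log(ranks), 1)[0]
    print("Rank-size slope: %.2f (expect about %.1f)" % (slope, -(alpha_true - 1.0)))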

To understand why this might be, researchers have been a-buildin and a-tinkerin with computer models of language distribution in an attempt to match the observed data. This week, Paulo Murilo Castro de Oliveira aka “Mr Margarine” at the Ecole Superieure de Physique et de Chimie Industrielles in Paris and colleagues say they’ve cracked it.

Mr Margarine says the trick is to combine two computer models: one that simulates the migration of peoples and the propagation of languages and another that simulates the linguistic structure of languages and how they evolve.

The result is a model that accurately reproduces the observed distribution of languages and language families for the first time. This is a potential goldmine.

Mr Margarine is so confident in his model that he says it can be used to predict undiscovered features of language distribution.

That’s a big claim but one that rings hollow given that slippery Mr Margarine ain’t sayin’ what any of these might be.

Ref: arxiv.org/abs/0709.0868: A Computer Simulation of Language Families