Sensor fusion is the tricky task of combining different data streams to give an output that is better than the sum of the parts. Let me tell ya, it ain’t easy. Too often, too much data just confuses things.
So hats off to Cherif “Very” Smaili at The French Institute for Research in Computer Science and Control in Nancy and his mates who have squeezed, merged and melded the data from several sources to improve the way autonomous navigation systems determine where they are.
Anybody who’s ever used GPS knows that at best, it provides intermittent data on position and direction. So Very Smaili has combined GPS information with data from the car’s odometer to give continuous navigation information.
If the GPS signal goes down, the onboard computer works out where it is on a digital map using the distance travelled by the car.
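The fallback described above can be sketched as a simple switch between the two sources. This is my own minimal illustration, not the authors' method (the paper uses a dynamic Bayesian network); the function name and the flat x/y coordinates are assumptions for the sake of the example.

```python
import math

def fuse_position(gps_fix, last_pos, heading, distance):
    """Return a position estimate, preferring GPS when available.

    gps_fix  -- (x, y) from GPS, or None when the signal drops
    last_pos -- (x, y) of the previous estimate
    heading  -- heading in radians at the previous estimate
    distance -- metres travelled since then, from the odometer
    """
    if gps_fix is not None:
        return gps_fix  # trust the GPS fix when we have one
    # GPS is down: project forward using the odometer distance
    x, y = last_pos
    return (x + distance * math.cos(heading),
            y + distance * math.sin(heading))
```

In a real system the GPS fix and the odometer projection would be weighted against each other (e.g. by a filter) rather than switched hard like this.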
Of course, that still leaves some uncertainty should the car come to a junction – there ain’t no way of telling from distance measurements which way the vehicle has gone. So the onboard computer keeps every possibility open until it can unambiguously work out where it is.
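The keep-every-possibility-open trick can be sketched as a list of road hypotheses that forks at each junction. Again, a hedged illustration of the idea rather than the paper's actual algorithm; the `road_graph` layout and names are my own assumptions.

```python
def advance_hypotheses(hypotheses, distance, road_graph):
    """Advance every surviving road hypothesis by the odometer distance.

    hypotheses -- list of (road_id, offset_m) candidate positions
    distance   -- metres travelled since the last update
    road_graph -- maps road_id -> (length_m, [successor road ids])
    """
    new = []
    for road, offset in hypotheses:
        length, successors = road_graph[road]
        pos = offset + distance
        if pos <= length:
            new.append((road, pos))           # still on the same road
        else:
            overshoot = pos - length
            for nxt in successors:            # fork at the junction:
                new.append((nxt, overshoot))  # keep every branch alive
    return new
```

Later evidence (a fresh GPS fix, a turn the map can't explain) would then prune the list back down to a single hypothesis.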
That looks simple and useful, which ain’t a bad combination.
Ref: arxiv.org/abs/0709.1099: Multi-Sensor Fusion Method using Dynamic Bayesian Network for Precise Vehicle Localization and Road Matching
For me this seems like someone invented the wheel… again.
The proposed combination isn’t actually even clever. Existing systems use odometer info from all the wheels of the car (ABS sensors) to determine where the car is turning, plus info from a gyro to make positioning even more accurate. It’s called “dead reckoning”.
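The dead reckoning the commenter describes boils down to integrating wheel distances and a heading change each step. A minimal sketch of one such step, assuming a simple two-wheel (differential) model; the track width and function name are my own placeholders, not anything from the paper.

```python
import math

def dead_reckon(x, y, theta, d_left, d_right, gyro_dtheta=None):
    """One dead-reckoning step for a simple two-wheel model.

    d_left, d_right -- wheel travel (m) from ABS-style encoders
    gyro_dtheta     -- heading change (rad) from a gyro; when present
                       it overrides the wheel-derived heading change
    """
    TRACK = 1.5  # assumed distance between the wheels, in metres
    d = (d_left + d_right) / 2.0  # distance travelled by the midpoint
    dtheta = gyro_dtheta if gyro_dtheta is not None \
        else (d_right - d_left) / TRACK
    theta_mid = theta + dtheta / 2.0  # integrate at the midpoint heading
    return (x + d * math.cos(theta_mid),
            y + d * math.sin(theta_mid),
            theta + dtheta)
```

Errors accumulate with distance, which is exactly why dead reckoning is normally paired with an absolute reference like GPS, as the post describes.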
Didn’t read the original paper through. Maybe there is something you failed to mention?
Anyhow, I’ve been enjoying your approach to science 🙂 Keep it coming!