Astronauts training to work on the International Space Station must master a mind-boggling amount of kit before they leave Earth. One of these devices is the Canadarm 2, a robotic arm used to manipulate experiments outside the station.
On Earth, astronauts train on a Canadarm 2 simulator connected to a virtual assistant that can spot potential errors, such as a move likely to smash the arm into the station. The assistant then offers hints and tips to the astronaut to help him or her make a correction or even issue a command to prevent damage.
The bad news for astronauts is that André Mayers and colleagues at Université de Sherbrooke in Canada, who created the virtual assistant, have given it an unusual upgrade. In an attempt to help the simulator learn more about the astronauts who are using it, the team has programmed the assistant to experience the equivalent of an emotion when it records a memory of what has happened.
The problem is that the assistant receives a huge amount of data from each training session. Its emotional response allows it to determine which of this data is most important, just as humans do. “This allows the agent to improve its behavior by remembering previously selected behaviors which are influenced by its emotional mechanism,” say the team.
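The paper's internals aren't described here, but the core idea, tagging each recorded episode with an emotional intensity and using that tag to decide what is worth recalling, can be sketched roughly. Everything below (the class names, the event strings, the 0-to-1 intensity scale) is illustrative, not taken from the paper:

```python
from dataclasses import dataclass


@dataclass
class Episode:
    """One recorded training event plus a hypothetical emotional intensity."""
    event: str
    emotion: float  # 0.0 (neutral) to 1.0 (highly charged); illustrative scale


class EpisodicMemory:
    """Stores every episode but recalls the most emotionally salient first."""

    def __init__(self) -> None:
        self.episodes: list[Episode] = []

    def record(self, event: str, emotion: float) -> None:
        self.episodes.append(Episode(event, emotion))

    def recall(self, k: int = 3) -> list[str]:
        # Emotion acts as the importance filter: high-arousal events
        # surface first, mundane ones sink to the bottom.
        ranked = sorted(self.episodes, key=lambda e: e.emotion, reverse=True)
        return [e.event for e in ranked[:k]]


memory = EpisodicMemory()
memory.record("arm moved to staging position", 0.1)
memory.record("near-collision with station truss", 0.9)
memory.record("joint limit warning ignored", 0.6)

# The two emotionally charged events come back ahead of the routine one.
print(memory.recall(k=2))
```

The point of the filter is economy: rather than replaying every frame of a session, the agent keeps only the episodes its emotional mechanism flagged as significant, which is the behaviour the team describes.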
The system is called the Conscious Tutoring System, or CTS. It’s not clear from the paper how well the system works, but how long before one unlucky astronaut hears the phrase: “I’m sorry Dave, I can’t do that.”
Ref: arxiv.org/abs/0901.4963: How Emotional Mechanism Helps Episodic Learning in a Cognitive Agent