This time last year, researchers at Bielefeld University in Germany equipped a six-legged robot, nicknamed Hector, with newly developed artificial intelligence software that gave it a simple form of consciousness, allowing the machine to learn to navigate an obstacle course.
Since then, the research team has been hard at work refining its AI technology, upgrading the software architecture to mimic the brain’s neural network — a significant step towards making Hector self-aware. The team is close to testing Hector’s software in simulation, ahead of physical trials with the robot itself.
As Dr. Holk Cruse, Professor at the Cluster of Excellence Cognitive Interaction Technology (CITEC) at Bielefeld University, explains, “What works in the computer simulation must then, in a second phase, be transferred over to the robot and tested on it.” CITEC is looking for “emergent” abilities within the software — in other words, learned behaviours that the AI was not explicitly programmed to perform. Cruse and his colleague, Dr. Malte Schilling, seem optimistic that Hector will demonstrate such traits.
“Emotions can be read from behaviour. For example, a person who is happy takes more risks and makes decisions faster than someone who is anxious,” Cruse says, adding, “Depending on its inner mental state, the system may adopt quick but risky solutions, and at other times it may take its time to search for a safer solution.”
“With the new software, Hector could observe its inner mental state – to a certain extent, its moods – and direct its actions using this information,” says Schilling. “What makes this unique, however, is that with our software expansion, the basic faculties are prepared so that Hector may also be able to assess the mental state of others. It may be able to sense other people’s intentions or expectations and act accordingly.”
Cruse adds, “The robot may then be able to ‘think’: what does this subject expect from me? And then it can orient its actions accordingly.”
Thank you to Hearst Electronic Products for providing us with this information.