Robots are developing emotions, thanks to engineers at Honda

Ask a native of Japan to describe the contents of a fishbowl and you’ll likely get a different answer from an American. For many years, psychologists have known that the language we speak and the culture we are raised in affect the way we see the world — what we notice, and perhaps just as important, what we fail to notice. So it’s not surprising that Japanese computer scientists have taken a different tack with AI research than their Western counterparts. In particular, they have focused on the emotional or character components of robots, rather than strictly on questions of efficiency and intelligence. Most recently, this has resulted in the creation of something called an “emotion engine” for AI, the result of a collaboration between Honda and Softbank.

While Japanese robots have lagged behind in some regards, in artificial emotions they are now showing signs of outdistancing the field. The reason may be as simple as asking seemingly nonsensical questions, like “if an autonomous car could feel, what emotion would it be feeling?” Though some might laugh, on an everyday basis humans are notorious for imputing emotions and motives where none exist. Think of the last time you saw someone curse at their car when it failed to start, or cajole it before giving the key another turn. Some neuroscientists have speculated this is due to an overactive social brain. For many millennia, human survival depended upon reading other people’s emotions. This led to the development of a hypersensitive emotion detection system, prone to seeing emotion and motive where none exists.


The success or failure of commercial AI might well depend on how robots and autonomous vehicles respond to people’s emotions. The emotion engine Honda is developing will serve that precise purpose. Using sensors and cameras, the AI will gauge the user’s emotional state and respond with an emotional gestalt of its own. Not surprisingly, the technology will first see use in a self-driving concept car Honda is developing called the NeuV. However, there is little reason to believe such a system would not find broader application in robots, like Pepper, that are specifically designed for human-robot interaction.
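Honda has not published how its emotion engine works internally, but the sense-classify-respond cycle described above can be sketched in a few lines. Everything below — the feature names, thresholds, and responses — is invented for illustration and has no connection to Honda’s or Softbank’s actual system:

```python
# Hypothetical sketch of an "emotion engine" loop. The sensor features,
# thresholds, and responses are illustrative assumptions, not details
# from Honda or Softbank.

from dataclasses import dataclass


@dataclass
class SensorReading:
    """Simplified stand-ins for camera and in-cabin sensor features."""
    smile_score: float    # 0.0-1.0, from a face-tracking camera
    voice_stress: float   # 0.0-1.0, from microphone analysis
    grip_pressure: float  # 0.0-1.0, from steering-wheel sensors


def classify_emotion(reading: SensorReading) -> str:
    """Map raw sensor features to a coarse emotional state."""
    if reading.voice_stress > 0.7 or reading.grip_pressure > 0.8:
        return "stressed"
    if reading.smile_score > 0.6:
        return "happy"
    return "neutral"


# The car's side of the exchange: each detected state maps to a response.
RESPONSES = {
    "stressed": "soften cabin lighting, suggest a break",
    "happy": "queue an upbeat playlist",
    "neutral": "no change",
}


def respond(reading: SensorReading) -> str:
    """One turn of the loop: sense, classify, respond."""
    return RESPONSES[classify_emotion(reading)]
```

For example, `respond(SensorReading(smile_score=0.1, voice_stress=0.9, grip_pressure=0.5))` classifies the driver as stressed and returns the calming response. A production system would replace the hand-tuned thresholds with learned models, but the loop’s shape stays the same.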

It’s worth asking whether the survival of our own emotional hardware might depend on such an emotion engine for machines. Just as humans were once highly dependent upon reading emotional cues for survival, the uptick in social media and computer-based interactions has changed the equations of survival and reproduction, with abstract reasoning ability now playing a disproportionate role in mate selection. Some scientists have even speculated that conditions like autism, which results in something like “emotion blindness,” could be the result of such changing environmental conditions.

“To control, in effect, is to be controlled: by driving the car properly, I enable it to play a safe and useful role in life,” said Masahiro Mori, the pioneering Japanese roboticist. “But by controlling me, the automobile enables me to be a reliable and effective driver. The same relationship links human beings with all machines. They don’t do what you want them to do unless you do what they force you to do.”

By spending excess time operating machines that lack emotions, the cognitive components that govern our own emotions could begin to atrophy. Granting emotions to machines, therefore, may ultimately — and counterintuitively — be more about preserving our own emotions than passing them on to robots.
