Jens Garbas, head of the intelligent-systems lab, with video images in which his scared facial expression is being analysed by software that can identify 100 emotions. The system was developed by the Fraunhofer Institute for Integrated Circuits in Erlangen, Germany.

What most people can see at a glance can be much harder for people with autism to discern: is that person looking happy or sad, annoyed or surprised? Most people on the autism spectrum have to learn over time what a smile or a wrinkled forehead means.
Researchers in the German city of Erlangen are working on software that they hope will make everyday life easier for people with autism.
Their Shore software (the name stands for “sophisticated high-speed object recognition engine”) is built into a pair of glasses and can show the wearer whether the person they are speaking to is a man or a woman, and roughly how old they are.
It can also recognise basic emotions.
A broad smile will produce the word “happy” and a red bar on the screen.
If a person’s mouth forms a round “O” shape and their eyes open wide, the screen shows the word “surprised.”
It doesn’t work perfectly, but Shore and similar programs can usually get the basic emotions — anger, sadness, fear, happiness, surprise and disgust — right.
“Studies have shown that these are identified correctly in around 90 per cent of cases,” says Berlin psychologist and emotions researcher Isabel Dziobek. That’s astonishing, she adds, given that human faces can express around 100 emotions.
Jens Garbas and his team at Erlangen’s intelligent-systems lab of the Fraunhofer Institute for Integrated Circuits have been working on the software for around 10 years. The principle behind it is simply one of automated learning.
The researchers have fed the program more than 30,000 specimen images, each labelled with whether the face is that of a man or a woman, the position of the mouth, nose and eyes, and what kind of emotion it is expressing.
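What this describes is standard supervised learning: a classifier is fed many labelled examples and learns to predict the emotion for faces it has never seen. The following is only an illustrative sketch of that principle in Python, using scikit-learn and synthetic stand-in data; it is not the Fraunhofer Shore code, and the feature layout is an assumption made purely for illustration.

```python
# Illustrative sketch of the supervised-learning principle described above.
# NOT the Fraunhofer Shore implementation: the features here are random
# stand-ins for real facial measurements, and scikit-learn is an assumed tool.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.ensemble import RandomForestClassifier

EMOTIONS = ["anger", "sadness", "fear", "happiness", "surprise", "disgust"]

rng = np.random.default_rng(0)

# Stand-in for the ~30,000 annotated specimen images: each sample is a vector
# of facial measurements (e.g. mouth, nose and eye positions), here random.
n_samples, n_features = 3000, 20
X = rng.normal(size=(n_samples, n_features))
y = rng.integers(0, len(EMOTIONS), size=n_samples)  # emotion label per face

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# "Automated learning": fit a classifier on the labelled examples ...
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)

# ... then ask it to name the emotion shown by faces it has never seen.
predicted = clf.predict(X_test[:5])
print([EMOTIONS[i] for i in predicted])
```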
Consumer researchers are already using the software for advertising and market research. For example, they can use it to find out how people react on the spot to an advert, and what they view positively or negatively.
Cameras with similar software are on the market already: they refuse to take photos unless the subject smiles.
The technology is also being used in shops to find out whether shoppers are men or women, their ages, what time of day they come in and how they move around the shop.
There are even digital adverts which can react to viewers, asking, for example, “You look so sad. Shall I tell you a joke?”
The researchers have lots more ideas. “The software could be used anywhere where people interact with machines,” says Garbas.
The goal is for the software to recognise ever more emotions, and to detect whether they are real or faked. For example, a real smile is not just made with the mouth, but activates more muscles and causes fine lines around the eyes.
The Fraunhofer researchers’ latest idea is to build the software into glasses for people with autism. Friedrich Nolte of the German Society for the Advancement of People with Autism (Bundesverband zur Förderung von Menschen mit Autismus) thinks the idea could work.
It could, for example, be helpful when people on the autism spectrum meet someone for the first time.
“When you don’t know people, it’s more difficult to read their emotions,” says Nolte.
But ultimately, he says, it makes more sense to teach people with autism how to read emotions for themselves, rather than with the help of software.
“It’s always better to learn a language yourself, rather than having an interpreter standing next to you,” he explains by way of comparison.
Isabel Dziobek is working on just that objective, but with the help of the software.
The technology can be used to recognise 40 emotions, created by combining the upper and lower halves of a face.
The emotions have been recorded by 70 actors in almost 8,000 video and audio sequences.
The idea is that the program would be used not just by people with autism, but also by people in professions in which the ability to recognise emotions is especially important, for example police officers, customs officers and care workers.
Garbas rejects the idea that the technology could encroach on privacy.
“All of the evaluations take place within the device and the information is stored without any link to the person,” he says. —DPA


