Machines can now spot diseases by analysing human emotions

Photo Credit: Pixabay

Researchers at the Media Lab at the Massachusetts Institute of Technology (MIT) have developed a machine-learning model that allows computers to analyse human emotions, the university said in a blog post on its news portal.

The model developed by the researchers can capture subtle facial expressions to better gauge moods. It achieves this by being trained on thousands of images of faces, the blog post stated.

Part of the field of affective computing, the technology can be used for a wide variety of purposes, such as tracking health, determining how interested students are in classrooms, detecting diseases and creating robot companions, the blog post said.

Though the model, which was presented last week at the Conference on Machine Learning and Data Mining, has numerous beneficial applications, the field still faces a major challenge: despite the emergence of deep learning technologies in recent years, robots and machines are not yet sophisticated enough to capture the subtlest of human emotions. The human brain can easily identify and interpret emotions, but machines find this difficult because they are less accurate and do not adapt well across different populations, the blog post stated.

To overcome this challenge, the researchers at the Media Lab combined two techniques that scan images for minute facial expressions.

According to the blog post, traditional affective-computing models are trained on a single set of images of facial expressions. The Media Lab researchers instead used a technique called mixture of experts (MoE), which combines a number of neural network models, and matched each of these experts to one of 18 individual video recordings.

“This is an unobtrusive way to monitor our moods. If you want robots with social intelligence, you have to make them intelligently and naturally respond to our moods and emotions, more like humans,” said Oggi Rudovic, a Media Lab researcher and co-author on a paper describing the model.

According to Rudovic, this was the first time the two techniques had been combined for affective computing.

“In MoEs, a number of neural network models, called ‘experts’, are trained to specialise in a separate processing task and produce one output. We have also incorporated a ‘gating network’, which calculates probabilities of which expert will best detect moods of unseen subjects,” said Michael Feffer, a co-author of the paper.
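The blog post does not include code, but the idea can be illustrated with a minimal sketch. The example below is an assumption-laden PyTorch-style illustration, not the MIT model: names such as AffectMoE, FEATURE_DIM and the per-frame feature vector are hypothetical, and the only details taken from the article are that each expert specialises in one individual's recordings (18 in total) and that a gating network weights the experts' outputs for an unseen subject.

```python
# Illustrative mixture-of-experts (MoE) sketch with a gating network.
# Assumptions (not from the MIT paper): facial features arrive as a
# fixed-length vector per frame, and each expert predicts a single
# affect score for that frame.
import torch
import torch.nn as nn

FEATURE_DIM = 128   # hypothetical size of a per-frame facial feature vector
NUM_EXPERTS = 18    # one expert per individual recording, as in the article


class AffectMoE(nn.Module):
    def __init__(self, feature_dim: int = FEATURE_DIM, num_experts: int = NUM_EXPERTS):
        super().__init__()
        # Each "expert" is a small network specialised on one subject's recordings.
        self.experts = nn.ModuleList([
            nn.Sequential(nn.Linear(feature_dim, 64), nn.ReLU(), nn.Linear(64, 1))
            for _ in range(num_experts)
        ])
        # The gating network outputs a probability for each expert, estimating
        # which experts will best detect the moods of an unseen subject.
        self.gate = nn.Sequential(nn.Linear(feature_dim, num_experts), nn.Softmax(dim=-1))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, FEATURE_DIM) facial features for a batch of frames
        expert_preds = torch.cat([expert(x) for expert in self.experts], dim=-1)  # (batch, num_experts)
        weights = self.gate(x)                                                    # (batch, num_experts)
        # Probability-weighted combination of expert outputs -> one affect estimate per frame
        return (weights * expert_preds).sum(dim=-1, keepdim=True)


# Usage sketch: score a batch of 4 frames from an unseen subject.
model = AffectMoE()
frames = torch.randn(4, FEATURE_DIM)
print(model(frames).shape)  # torch.Size([4, 1])
```

The design intuition, as described in the blog post, is that personalising experts to individuals and then blending them via the gating network lets the model adapt to faces it has never seen, rather than relying on a single model trained on one generic dataset.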

Artificial intelligence is increasingly being used to analyse human behaviour.

Recently, researchers from the University of Stuttgart in Germany and Flinders University in Australia developed a new artificial-intelligence system that tracks a person's eye movements to identify their personality type.

In May, researchers at the University of California, Los Angeles (UCLA) developed an app-based AI engine called ChatterBaby to better understand how babies think and feel. The app can decode baby cries and can detect autism at an early stage.
