Researchers develop an AI system capable of understanding the tone of a conversation
The same message can carry different meanings depending on the tone of the conversation, so understanding the tone used by our interlocutor is fundamental to avoiding misunderstandings.
Until now, this has been one of the main challenges for artificial intelligence (AI) systems. Now, a team of researchers has developed an AI system that, combined with a wearable device, is able to understand the tone of a conversation.
Among its applications: serving as a coach for the user wearing the device, or even helping people with anxiety or Asperger's syndrome.
AI assistants that communicate with users by voice, such as Siri, have improved dramatically over the past few years and can now handle a conversation with a human interlocutor with considerable skill.
However, a recurring problem remains: their inability to detect the tone or emotion behind a message, something fundamental to avoiding misunderstandings.
Researchers at the MIT Computer Science and Artificial Intelligence Laboratory (CSAIL) and the Institute for Medical Engineering and Science (IMES) have partially solved this problem.
For the moment, the system is still not very fine-grained, but it can already classify the tone of a conversation as positive, neutral, or sad by combining information from certain speech patterns with the vital signs captured by the device.
When one of the participants in the conversation narrates a story, the system can analyze that person's audio, text transcripts, and physiological signals to determine the overall tone of the story with up to 83% accuracy.
In addition, using deep learning techniques, the system can provide a "sentiment score" for specific five-second intervals within a conversation.
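The approach described above, segmenting a conversation into five-second windows and fusing speech features with vital signs into a per-window score, can be sketched roughly as follows. The feature names, weights, and thresholds here are illustrative assumptions of mine, not the researchers' actual deep-learning model:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Window:
    """Features for one five-second slice of a conversation.
    All fields are hypothetical stand-ins for the real features."""
    energy: float      # mean speech energy, normalized to 0..1
    pitch_var: float   # pitch variability, normalized to 0..1
    heart_rate: float  # beats per minute from the wearable

def sentiment_score(w: Window) -> float:
    """Toy linear fusion of speech and vital-sign features into a score
    in [-1, 1]; positive means upbeat, negative means sad. Weights are
    invented for illustration only."""
    hr_arousal = (w.heart_rate - 70.0) / 50.0  # deviation from an assumed resting baseline
    score = 0.6 * w.pitch_var + 0.5 * w.energy - 0.4 * max(hr_arousal, 0.0) - 0.2
    return max(-1.0, min(1.0, score))

def classify_story(windows: List[Window]) -> str:
    """Average the per-window scores to label the overall tone of a story
    as positive, neutral, or sad."""
    avg = sum(sentiment_score(w) for w in windows) / len(windows)
    if avg > 0.15:
        return "positive"
    if avg < -0.15:
        return "sad"
    return "neutral"

# Two animated, moderately energetic five-second windows
windows = [Window(energy=0.8, pitch_var=0.7, heart_rate=72),
           Window(energy=0.6, pitch_var=0.5, heart_rate=75)]
print(classify_story(windows))  # prints "positive"
```

A real system would replace the hand-picked linear weights with a model learned from labeled conversations, but the window-by-window scoring and the final three-way label follow the same shape.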
“As far as we know, this is the first experiment that collects both physical and speech data, in a passive but robust way, even when subjects have natural, unstructured interactions,” says Mohammad Ghassemi, one of the authors of the study.
“Our results show that it is possible to classify the emotional tone of the conversations in real time.”
In the experiments, the researchers used smartwatches already on the market, namely the Samsung Simband.
This shows how close we could be to having such solutions on the market.
To protect the privacy of users and their personal information, the algorithm runs locally on the user’s device.
However, the researchers point out that a commercial solution would require clear protocols for obtaining the consent of all participants.
Researchers will present the results of their experiments at the Association for the Advancement of Artificial Intelligence (AAAI) conference this week in San Francisco.
The next step will be to test conversations involving multiple people, each wearing their own wristband device, in order to improve the algorithm's performance.
The final objective is to make the system able to identify different tones in a much more fine-grained way, detecting, for example, boredom, anger, nervousness, or apathy.