
How Does AI Read Emotions?

12/03/2020

As technology progresses and the world becomes more virtual, many fear that we will lose human connection and interaction. But have you considered technology replacing those interactions? Experts and researchers have been developing artificial intelligence systems that think and behave like humans, and that detect and react to human emotions.

Biosy Pitre, a provider of emotion recognition and analysis software, says emotion AI lets developers learn facial patterns and understand emotions based on user reactions. This work aims to build personalized user experiences that can enhance lives.

When Marvin Minsky and John McCarthy founded the field of AI in 1956, they were astonished at how a machine could solve incredibly tough puzzles faster than humans. But it turns out that teaching AI to win a chess match is relatively easy. The real challenge is teaching machines what emotions are and how to mimic them.

Social and emotional intelligence come almost naturally to humans; we react instinctively. While some of us are more perceptive than others, we can easily interpret the emotions and feelings of those around us. The base-level intelligence we are partly born with and partly learn tells us how to act in certain situations. So, can these understandings be taught to a machine?

Socially-intelligent Robots

Aniket Bera, an Assistant Research Professor at UMIACS, is currently working on an exciting line of research: “socially-intelligent robots.” He forecasts that humans will soon be in close proximity to autonomous robots in public places such as offices, homes, buildings, and sidewalks. These robots are being assigned to a growing range of activities, including warehousing, surveillance, and delivery. The challenges of the global pandemic are even encouraging hospitals and healthcare facilities to introduce growing numbers of autonomous robotic systems into critical healthcare operations.

One of the most fascinating research applications of socially-intelligent robots is their ability to read body language. The study of recognizing emotion from facial expressions is already fairly well established.

Emotion AI

Emotion AI does not mean a weeping computer that has had a bad day. Emotion AI, also referred to as affective computing, dates back to 1995 and is the branch of AI that aims to detect, interpret, and even mimic human emotions. The technology aims to make interactions between human and machine more natural, building an AI that interacts more authentically. If AI can gain emotional intelligence (EI), perhaps it can also mimic those emotions.

Rosalind Picard and Rana el Kaliouby founded Affectiva in 2009, a Boston-based emotion AI company that specializes in automotive AI and marketing research. With the user's consent, their camera captures their reactions while they watch an advertisement. By evaluating facial expressions, body language, and speech together, multimodal emotion AI can gain a fuller insight into the individual's mood.

Affectiva reports accuracy levels around 90%, thanks to a varied dataset of 6 million faces from 87 different countries used to train its deep learning algorithms. From such a diverse dataset, the AI learns which patterns of body language and speech correspond to particular thoughts and emotions.
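
To illustrate the idea in miniature, here is a toy supervised-learning sketch in Python. The features, labels, and classifier are illustrative assumptions only; they are not Affectiva's actual pipeline.

# Toy sketch of supervised emotion classification (illustrative assumptions,
# not any real company's pipeline).
from sklearn.ensemble import RandomForestClassifier

# Hypothetical feature vectors, e.g. [smile_intensity, brow_raise, speech_pitch_variance]
X_train = [
    [0.9, 0.1, 0.7],   # broad smile, lively speech
    [0.1, 0.8, 0.2],   # raised brows, flat speech
    [0.2, 0.2, 0.1],   # neutral face, monotone
]
y_train = ["happy", "surprised", "neutral"]

clf = RandomForestClassifier(random_state=0).fit(X_train, y_train)
print(clf.predict([[0.8, 0.2, 0.6]]))  # likely "happy" for a smiling, animated input

The point of the sketch is simply that, given enough labeled examples, a model learns which measurable cues tend to co-occur with which emotions.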

Like humans, machines can draw more detailed insights into our emotions from video and speech than from text alone. Sentiment analysis, a sub-field of natural language processing, is the computational process of recognizing and classifying opinions expressed in text to determine the writer's attitude towards a topic. It can be applied in sectors such as call centers, telemedicine, think tanks, and sales and marketing to take communication to another level; a minimal sketch of what it looks like in practice follows.
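
The snippet below uses NLTK's off-the-shelf VADER analyzer as a minimal example of text-based sentiment analysis. The sample sentences and the positive/negative threshold are illustrative assumptions, not part of any product described above.

# Minimal sentiment-analysis sketch using NLTK's rule-based VADER analyzer.
import nltk
from nltk.sentiment import SentimentIntensityAnalyzer

nltk.download("vader_lexicon")  # one-time download of the sentiment lexicon

analyzer = SentimentIntensityAnalyzer()
messages = [
    "I absolutely love this product, it made my day!",
    "The support call was slow and frustrating.",
]

for text in messages:
    scores = analyzer.polarity_scores(text)  # neg / neu / pos plus a compound score
    label = "positive" if scores["compound"] >= 0 else "negative"
    print(f"{label:>8}  {scores['compound']:+.2f}  {text}")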

Although AI might sort what we say into positive or negative boxes, does it actually understand how we feel, or the subtext beneath? As humans, we use sarcasm, cultural references, and nuance in language that can totally change the meaning and, thus, the emotion conveyed. At times, what we leave out and don't say can imply just as much about how we are feeling. AI is not yet capable of understanding this subtext, and many doubt it ever will be.

Can AI display emotion?

In several use cases, such as telemedicine, call centers, chatbots, and virtual assistants, businesses are investigating advances in emotion AI to understand clients' emotions and improve these platforms' individual responses. The ability to simulate human-like emotions gives these platforms and services more authenticity. However, is this a genuine display of emotion?

AI and neuroscience researchers agree that existing forms of AI cannot have emotions of their own, but they can replicate the outward signs of emotions such as empathy. Synthetic speech even helps avoid the robotic tone many services operate with and conveys more genuine-sounding emotion. Google's Tacotron 2 is advancing the field by simulating human-like artificial voices.

So, in many cases, if machines can understand how we feel and thus exhibit a helpful and caring response, are they emotionally intelligent? Much debate is going on within this field over whether a simulation of emotion demonstrates real understanding or remains merely artificial.

Functionalism argues that if we simulate emotional intelligence then, by definition, the AI is emotionally intelligent. But skeptics question whether the machine genuinely understands the message it is delivering, arguing that a simulation does not prove the device is emotionally intelligent.

Artificial General Intelligence (AGI) 

Building an AGI with a deep level of understanding is how several experts believe a machine could one day experience emotions as we do.

As opposed to narrow intelligence, AGI refers to a computer's ability to carry out a wide variety of tasks, as humans do. Artificial narrow intelligence, by contrast, aims to complete individual tasks with a high level of efficiency and accuracy.

Emotional and social intelligence, as forms of intelligence that are not tied to a set task or goal, fall under AGI. AGI aims to mimic the qualities that seem automatic to us. They aren't tied to an end goal; we do them simply because we do.

Conclusions

We are still years away from having an AGI able to mimic every action we can perform, especially those qualities that we consider most human, like emotions.

Emotions are naturally tough to read, and there is often a disconnect between what people say they feel and what they truly feel. A machine might never reach this level of understanding, but who is to say that the way we comprehend emotions is the only way? Our interpretation of each other's emotions is full of bias and opinion, so perhaps AI can help us get to the heart of our feelings.

Author Bio:

Harnil Oza is CEO of Hyperlink InfoSystem, one of the leading app development companies in New York and India, having a team of the best app developers who deliver the best mobile solutions mainly on Android and iOS platforms. He regularly contributes his knowledge on leading blogging sites.
