|Photo by: Syda Productions via Shutterstock|
The increasing adoption of AI across most industries has become inevitable. A recent McKinsey Global Survey reported a 25% increase in the use of AI in standard business processes. Most executives whose companies have adopted AI say it has increased their revenue, while 44% say it has reduced costs.
About 58% of respondents in the survey said their companies have embedded at least one AI capability, up from 47% in 2018. The share of companies using AI in products or processes across several business units and functions has also risen, so AI can only be expected to keep growing. According to Forbes, a global media company covering business, investing, technology, entrepreneurship, leadership, and lifestyle, the worldwide AI market is projected to reach $202.57 billion by 2026, up from $20.67 billion in 2018.
However, as AI progresses and the world becomes more virtual, there is a fear that humans might lose connection and communication. For years, researchers and developers have been advancing AI not only to create tools and systems that think and act like humans but also to detect and react to human emotions.
AI in Reading Emotions
Researchers are developing AI systems that can precisely gauge our thoughts and feelings to make our daily lives easier. These could be used to develop more personalized and convenient products and services for consumers. While many AI efforts are noteworthy, they barely scratch the surface of how AI could be used to understand people's wants and needs.
Earlier this year, researchers from the University of Colorado and Duke University introduced EmoNet, a neural network that classifies images by emotion, reliably distinguishing 11 of the 27 categories it was trained on. According to TB Tech, a UK-based publication covering the latest news on AI, the Internet of Things, blockchain and cryptocurrency, and more, the team classified 137,482 video frames from 2,187 videos into 27 distinct emotion categories such as anxiety, surprise, and sadness, then validated the results on 25,000 additional images after training. The aim is to predict what people feel from images alone.
The researchers concluded that the AI can accurately predict emotions like sexual desire, craving, or horror, but it couldn't predict awe or confusion with great precision. Meanwhile, emotions like joy, adoration, and amusement confused the model because they involve similar facial expressions. This kind of AI application could be helpful for mental health purposes such as treatments, therapeutics, and interventions.
For instance, IoT devices powered by this AI could serve as household robots or assistants, essentially picking up on people's emotions in order to assist them. People who suffer from anxiety and depression could also use it as a mood diary: they could log their facial expressions and let the AI work out how they are feeling. "Moving away from subjective labels such as 'anxiety' and 'depression' towards brain processes could lead to new targets for therapeutics, treatments, and interventions," Philip Kragel, one of the researchers in the study, said.
Microsoft, meanwhile, has been using facial recognition technology that tracks eight "core" states: anger, contempt, fear, disgust, happiness, sadness, surprise, and neutral. The technology identifies action units (AUs), the specific facial muscle movements people make, and links combinations of them to particular emotions. For instance, the AI can conclude that a person is happy when the "cheek raiser" and "lip corner puller" AUs are detected together.
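The AU-to-emotion logic described above can be sketched as a simple rule lookup. The AU names and rules below are illustrative assumptions built from the "cheek raiser" plus "lip corner puller" example, not Microsoft's actual rule set:

```python
# Minimal sketch of rule-based emotion inference from facial action units (AUs).
# The AU labels and rules are illustrative assumptions, not Microsoft's actual
# mapping; real systems use many more AUs and probabilistic models.

# Each emotion is linked to a set of AUs that must all be detected together.
EMOTION_RULES = {
    "happiness": {"cheek raiser", "lip corner puller"},
    "surprise": {"inner brow raiser", "jaw drop"},
    "anger": {"brow lowerer", "lid tightener"},
}

def infer_emotions(detected_aus):
    """Return every emotion whose required AUs are all present."""
    detected = set(detected_aus)
    return [emotion for emotion, required in EMOTION_RULES.items()
            if required <= detected]

print(infer_emotions(["cheek raiser", "lip corner puller"]))  # ['happiness']
```

In practice a detector would output AU activations with confidence scores rather than a clean list of names, but the principle of combining AUs into emotion labels is the same.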
However, Fujitsu, a Japanese multinational information technology equipment and services company, says the current technology is not accurate because the AI needs to be trained on huge datasets for each AU, and researchers don't have enough images for that. The company therefore developed a way to track emotions better. According to ZDNet, a business technology news website, Fujitsu created a tool that extracts more data from a single picture instead of collecting more images to train the AI. Fujitsu reported a detection accuracy of 81%, compared with Microsoft's 60%.
"With the same limited dataset, we can better detect more AUs, even in pictures taken from an oblique angle, for example. And with more AUs, we can identify complex emotions, which are more subtle than the core expressions currently analyzed," a spokesperson for Fujitsu said.
|Photo by: Selenophile via Shutterstock|
Can We Trust AI to Read Our Emotions?
While AI systems and tools that can read or detect human emotions could bring huge benefits in some industries, some experts are skeptical. Recently, the Association for Psychological Science published a review titled "Emotional Expressions Reconsidered: Challenges to Inferring Emotion From Human Facial Movements" showing that companies that use AI to evaluate people's emotions may be misleading consumers.
According to The Verge, an American technology news and media network, five distinguished scientists from different theoretical camps in emotion science were asked to scrutinize the evidence. They concluded that human emotions are expressed in a huge variety of ways, so it is not reliable to infer how a person feels from a simple set of facial movements.
“People, on average, the data show, scowl less than 30% of the time when they’re angry. So scowls are not the expression of anger; they’re an expression of anger — one among many. That means that more than 70% of the time, people do not scowl when they’re angry. And on top of that, they scowl often when they’re not angry,” Lisa Feldman Barrett, a professor of psychology at Northeastern University and one of the review’s five authors, said.
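Barrett's point can be made concrete with a quick base-rate calculation. The 30% scowl-when-angry figure comes from the review quoted above; the base rate of anger and the scowl-when-not-angry rate below are purely illustrative assumptions for the sake of the arithmetic:

```python
# Bayes' rule sketch: how often is a scowling person actually angry?
# P(scowl | angry) = 0.30 comes from the figure quoted in the review;
# the other two probabilities are illustrative assumptions.

p_scowl_given_angry = 0.30      # review: people scowl <30% of the time when angry
p_angry = 0.10                  # assumed base rate of anger at any given moment
p_scowl_given_not_angry = 0.10  # assumed: people also scowl when concentrating, etc.

# Total probability of observing a scowl.
p_scowl = (p_scowl_given_angry * p_angry
           + p_scowl_given_not_angry * (1 - p_angry))

# Posterior probability that a scowling person is angry.
p_angry_given_scowl = p_scowl_given_angry * p_angry / p_scowl

print(round(p_angry_given_scowl, 2))  # 0.25
```

Under these assumed numbers, only a quarter of detected scowls would actually correspond to anger, which is exactly the kind of unreliability the review warns about.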
Nonetheless, AI systems and tools that can detect or interpret emotions are expected to multiply in the coming years. They can be used for a variety of purposes, transforming the way we live yet again.
|Photo by: pathdoc via Shutterstock|