UK and Japan Will Examine Emotional AI in Smart Cities
Wed, April 21, 2021

The UK and Japan launched six projects to investigate the multiple, uncertain, and wide-ranging impacts of AI on our society / Credits: FeelGoodLuck via Shutterstock


Artificial intelligence is programmed to solve problems and improve existing operations. AI tools do what they are tasked to do and have no concern with anything outside their assignments. But what if we taught AI to understand our feelings? In recent years, researchers have been experimenting with new ways to teach AI to analyze human facial expressions and identify the emotions behind them. Emotional AI is the researchers' way of making algorithms more human.
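At a high level, such systems typically detect a face in an image and then feed the cropped face to a trained emotion classifier. The Python sketch below illustrates that pipeline using OpenCV's bundled face detector; the `emotion_model` object and the seven-label emotion list are hypothetical placeholders standing in for whatever trained model a real deployment would use.

```python
# A minimal sketch of a facial emotion recognition pipeline:
# locate faces in a frame, then classify each face's emotion.
# "emotion_model" is a hypothetical stand-in for a trained classifier.
import cv2

# OpenCV ships with a Haar cascade for frontal face detection.
face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

# A common label set for emotion classifiers (assumed here, not universal).
EMOTIONS = ["angry", "disgust", "fear", "happy", "sad", "surprise", "neutral"]

def label_emotions(frame, emotion_model):
    """Return a (bounding box, predicted emotion) pair for each detected face."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    results = []
    for (x, y, w, h) in face_detector.detectMultiScale(gray, 1.3, 5):
        # Crop and resize the face to the classifier's expected input size.
        face = cv2.resize(gray[y:y + h, x:x + w], (48, 48))
        scores = emotion_model.predict(face)  # hypothetical model interface
        results.append(((x, y, w, h), EMOTIONS[scores.argmax()]))
    return results
```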

To understand the use of emotional AI, Japan and the UK have jointly launched six projects that aim to deepen understanding of how AI technologies affect people's lives. The projects will investigate the multiple, uncertain, and wide-ranging impacts of AI on our society, covering topics including the impact of AI on individuals' happiness and wellbeing; issues of transparency, responsibility, governance, and ethics; and the economic implications for skills, work, and education.

According to Smart Cities World, an online publication covering the infrastructure needed to build smart cities now and in the future, the project teams will interview organizations that are developing or deploying emotional AI in smart cities and examine existing governance of the collection and use of intimate data about people's emotions, particularly in public spaces.

Northumbria University in England will focus on the use of emotional AI in policing and security, exploring both its benefits and its potential problems. Experts believe emotional AI has the capacity to enhance safety, particularly in preventing crime and improving security. However, Dr. Diana Miranda, a lecturer in criminology at Northumbria University and a member of Northumbria's Centre for Crime and Policing, cautions that there are risks.

“We must understand the social and ethical implications that arise from the use of different techniques that allow our bodies, feelings, intentions and emotional states to be read by machines,” she said.