Emotion recognition tech fundamentally flawed, says human rights charity

By Ben Willis on 12/02/2021
Article 19’s report says the assumption that emotions can be unearthed from facial expressions has been discredited. Credit: teguhjatipras/Pixabay

Emerging AI-based emotion recognition technologies are “untenable” under international human rights law, and developing, designing or sharing such tech should be banned, a recent report has argued. 

The study, which focused on the growing use of emotion recognition technologies by state and corporate bodies in China, was published in January this year by Article 19, a British human rights organisation focusing on freedom of expression and information.

The report noted: “Even as some stakeholders claim that this technology can get better with time, given the pseudoscientific and racist foundations of emotion recognition on one hand, and fundamental incompatibility with human rights on the other, the design, development, deployment, sale, and transfer of these technologies must be banned.”

Biometric technologies, such as facial recognition, are commonly used for identification purposes. But emotion recognition goes further, using machine learning to infer a person’s emotional state from analysis of facial expressions.
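The inference step described above is the contested part. As a rough illustration only (not any vendor's actual system), the core of such a pipeline is a classifier that maps facial measurements to emotion labels. This toy sketch uses arbitrary placeholder weights in place of a trained model; the emotion names and feature values are hypothetical:

```python
# Illustrative sketch only: a toy version of the inference step in an
# emotion recognition pipeline. Real systems use trained neural networks
# on face images; the weights below are arbitrary placeholders.
import math

EMOTIONS = ["happy", "sad", "angry", "neutral"]

def softmax(scores):
    """Convert raw scores into a probability distribution."""
    exps = [math.exp(s) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def classify_emotion(facial_features):
    """Map extracted facial features to an inferred emotion label.

    `facial_features` stands in for measurements a real pipeline would
    extract (e.g. mouth curvature, brow position). The linear weights
    here are hand-picked for illustration, not learned from data.
    """
    weights = {
        "happy":   [1.2, -0.3],
        "sad":     [-1.0, 0.4],
        "angry":   [-0.5, 1.1],
        "neutral": [0.1, 0.1],
    }
    scores = [sum(w * f for w, f in zip(weights[e], facial_features))
              for e in EMOTIONS]
    probs = softmax(scores)
    best = max(range(len(EMOTIONS)), key=lambda i: probs[i])
    return EMOTIONS[best], probs[best]

label, confidence = classify_emotion([0.9, -0.2])
```

The report's argument is that even a well-trained version of this mapping rests on the discredited assumption that facial expressions reliably encode inner emotional states, so improving the classifier does not fix the underlying problem.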

China is leading the development and deployment of such systems, according to the report. The technology is already being used in law enforcement to identify “suspicious” individuals, it says, while schools are deploying it to monitor pupils’ attentiveness and companies are using it to determine access to credit.

Indeed, research suggests the emotion recognition technology market is set to expand rapidly. Recent analysis predicts that the global market for the technology will grow from US$5 billion in 2015 to US$65 billion by 2024.

Rights and science

The report raises two main concerns about emotion recognition technology: its implications for human rights, and the science underpinning it.

The technology is built around three “scientific assumptions” that have been discredited, the report noted: that “facial expressions are universal; that emotional states can be unearthed from them; and that such inferences are reliable enough to be used to make decisions.”

Unregulated proliferation of emotion recognition technologies would have grave consequences for human rights and freedom of expression, the authors warned. Concerns about the scientific basis for the technology are “further exacerbated by how it is used to surveil, monitor, control access to opportunities, and impose power,” the report said.

This could also lead to unjust treatment, according to Article 19. “By analysing and classifying human beings into arbitrary categories that touch upon the most personal aspects of their being, emotion recognition technology could restrict access to services and opportunities,” according to the organisation’s website.

Time to act

One of the report’s authors, Article 19’s senior programme officer Vidushi Marda, told Politico’s AI: Decoded column that there were no ethical uses of emotion recognition tech.

“It’s fundamentally inconsistent with values that underpin any kind of human rights framework, especially when you think about dignity, privacy, right against self-incrimination and even the freedom of opinion,” she said.

Action is needed fast, the report concluded, before the market grows to the point where providers can influence future regulation in their favour.

Marda told Politico: “You can’t have meaningful conversations about how to mitigate the dangers of these technologies and how to avoid these dangers if you’re going to let the companies call the shots at the end of it.”

Recent years have seen the launch of a number of global initiatives to introduce standards for the use of AI, with actors including Canada, Australia, the US, the UK, the OECD and the EU moving to address the risks of unethical or improper use of the emerging technology.

About Ben Willis

Ben Willis is a journalist and editor with a varied background reporting on topics including public policy, the environment, renewable energy and international development. His work has appeared in a variety of national newspapers including the Guardian, Daily Telegraph and Times, as well as numerous specialist business, policy and consumer publications.
