Emotion Recognition Is China’s New Surveillance Craze

China is in the process of introducing emotion recognition technology to countrywide monitoring. (Image: fotologic via Flickr CC BY 2.0)

As if the current level of surveillance in China were not enough, the CCP is in the process of adding emotion recognition to its countrywide monitoring network. As the name suggests, the technology will use the millions of cameras spread throughout the nation to scan people's faces and determine how they feel. Given China's history of using technology to repress its people, emotion recognition will likely turn out to be yet another tool of oppression.

Emotion recognition

“Using video footage, emotion recognition technology can rapidly identify criminal suspects by analysing their mental state… to prevent illegal acts including terrorism and smuggling… We’ve already started using it,” Li Xiaoyu, a policing expert and Party cadre from the public security bureau in Altay city in Xinjiang, told the Financial Times (via The Straits Times).

The tech is largely deployed by Xinjiang’s customs inspectors and is claimed to identify a person’s stress levels, nervousness, aggressiveness, and potential for committing violence against others. The Xinjiang region has been a focus of international scrutiny due to China’s inhumane treatment of Uyghur minorities. As such, it is possible that the state might eventually use the technology to arrest or detain Uyghurs whom the system classifies as a threat.

The tech is largely deployed by Xinjiang’s customs inspectors. (Image: Screenshot / YouTube)

Emotion recognition will soon be rolled out at airports and subway stations. However, a large-scale rollout is unlikely for at least three to five years, as the technology still needs to mature. For now, experts consider emotion recognition software largely a gimmick. A study published in the academic journal Sensors noted that Facial Expression Recognition (FER) is difficult to do accurately.

“While FER systems can achieve accuracy rates of up to 97 percent in the lab, that rate drops to around 50 percent in real-world apps. Things like variances in lighting and the position of the head can throw off the system’s performance,” according to Disruptive Asia. There is also the fact that the expression a person shows may not correspond to what they are actually feeling. For instance, people sometimes cry when they feel joyful. An emotion recognition AI might see only the tears and classify the emotion as sadness when in reality the person is happy.
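The gap between lab and real-world accuracy comes down to uncontrolled variation. The toy simulation below (an illustration only, with made-up numbers, not any real FER system) classifies synthetic "expression" feature vectors by nearest centroid, and shows how accuracy collapses once the features carry the kind of extra variance that lighting and head pose introduce:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for expression classification: each of 3
# "expressions" is a cluster of 100-dimensional feature vectors
# around its own centroid. All numbers here are invented.
n_classes, dim, n_test = 3, 100, 200
centroids = rng.normal(size=(n_classes, dim))

def sample(n, noise):
    """Draw n labeled feature vectors with Gaussian noise of a given scale."""
    y = rng.integers(n_classes, size=n)
    x = centroids[y] + rng.normal(scale=noise, size=(n, dim))
    return x, y

def nearest_centroid_accuracy(x, y):
    """Classify each vector by its nearest centroid and score accuracy."""
    dists = ((x[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=-1)
    return float((dists.argmin(axis=1) == y).mean())

# "Lab" conditions: little variation beyond the expression itself.
x_lab, y_lab = sample(n_test, noise=0.5)
# "Real world": lighting and head pose add large, uncontrolled variance.
x_wild, y_wild = sample(n_test, noise=8.0)

acc_lab = nearest_centroid_accuracy(x_lab, y_lab)
acc_wild = nearest_centroid_accuracy(x_wild, y_wild)
print(f"lab accuracy:        {acc_lab:.2f}")
print(f"real-world accuracy: {acc_wild:.2f}")
```

The same classifier that looks near-perfect on clean data degrades sharply on noisy data, which is the pattern the Sensors study describes, even before accounting for expressions that do not match the underlying emotion at all.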

China has set up one of the most extensive public surveillance systems anywhere in the world. A report by Comparitech found that 8 of the 10 most surveilled cities are in China. At the top spot was the Chinese city of Chongqing, which has about 168 cameras for every 1,000 people.

Changing behavior

A recent study by Axios and SurveyMonkey shows that people tend to change their behavior when they realize that they are under surveillance. The conclusion came after an online poll of 3,454 employed adults that found “62 percent of people… said it is appropriate for an employer to routinely monitor employees using technology… 48 percent said they would change their behavior if they knew their employer was monitoring them,” according to Axios.

A recent study shows that people tend to change their behavior when they realize that they are under surveillance. (Image: via pixabay / CC0 1.0)

In some situations, surveillance can be useful for keeping track of employees and improving productivity. More often, however, it creates a negative workplace environment that can eventually erode trust among employees. Some may try to evade supervision, prompting management to install even tighter monitoring systems.

It will be interesting to see how Beijing’s widespread surveillance network changes the behavior of the Chinese people. With every emotion being classified and monitored with the intention of identifying threats, citizens may eventually grow paranoid.

From Vision Times
