
China Uses Emotion Recognition To Predict Crimes In Its New Surveillance Tech

Watch out, China: emotion recognition is on its way, if a new tech expo announcement is anything to go by.

The technology is being rolled out at airports and subway stations to identify criminal suspects and is the latest development in crime prediction systems.

Emotion recognition systems already exist in Xinjiang, a region in western China where an estimated one million people, mostly from Muslim minority groups, are held in internment camps.

Last month, the United States blocked eight Chinese artificial intelligence companies from buying US-made products over their alleged involvement in human rights abuses in Xinjiang.

Financial Times correspondent Sue-Lin Wong revealed the news in a series of tweets after visiting China's largest surveillance tech expo in Shenzhen.

She wrote: “I visited China’s largest surveillance tech expo with @QianerLiu this week held once every two years in Shenzhen – ‘the world security capital.’ A thread & our story about China’s latest new surveillance craze: emotion recognition.”

According to Wong, a policing expert and party cadre from Xinjiang's public security bureau told the FT reporters that they have started using emotion recognition to identify criminal suspects, and that they work with major Chinese tech companies such as Alibaba, Tencent, Hikvision and Dahua.

Li Xiaoyu, a policing expert from the public security bureau in Altay city in Xinjiang, said: “Using video footage, emotion recognition technology can rapidly identify criminal suspects by analysing their mental state . . . to prevent illegal acts including terrorism and smuggling. We’ve already started using it.”

Companies such as Amazon, Microsoft and Google are all developing emotion recognition technology. Some scientists, however, have questioned whether such systems can work reliably at all.

Ge Jia, a Beijing-based blogger, wrote: “This technology is still a bit of a gimmick and is unlikely to be rolled out on a large scale in the next 3-5 years.”
