Much is currently being written about Chinese IT companies, which have also come under suspicion from our intelligence services over the rollout of 5G technologies. The latest findings will do nothing to repair Huawei's damaged reputation.
The Chinese IT giant has worked with security suppliers to develop surveillance products, some of which may be used to identify people by ethnicity.
For example, it tested software that could recognise the faces of members of the Uighur minority and automatically alert the police to their presence.
The company is also suspected of supplying some of these surveillance products, together with its partners, to state authorities in the north-western region of Xinjiang. For decades, the Chinese Communist Party has sought to control and assimilate the Uighurs, a Turkic ethnic minority concentrated in this region.
Independent experts from the research organisation IPVM published information from an internal Huawei report. The document, which detailed tests carried out by Huawei and Megvii, was labelled confidential but was publicly available on Huawei's European websites; it has since been removed.
The report claims that the face-scanning system could trigger a "Uighur alarm", raising concerns that the software could help fuel China's crackdown on the Muslim minority.
The revelation has sparked an international backlash against Huawei, which has rejected the accusations. The company denies that it develops or sells systems that identify people by their ethnic origin, and has ruled out using its technologies to discriminate against or oppress members of any community.
With its wide range of smartphones equipped with powerful cameras, and cloud systems that store user data alongside data from other technical tools and devices, Huawei has become a major player in surveillance technology.
In 2018 it collaborated with the start-up Megvii to test Megvii's Face++ facial recognition system, which could scan faces in a crowd and estimate each person's age, sex and ethnicity.
Megvii is one of the world's largest providers of facial recognition systems, which are used throughout China in many public places as well as in private buildings.
Developers, however, argue that such systems reflect the country's technological advancement and that their wider use can help the government keep people safe.
Human rights defenders warn that these technologies are dangerous in the wrong hands: the systems could be abused by countries seeking to criminalise minorities and tighten political control.
The mass spread of devices containing such surveillance technologies means that sensitive data is collected on the manufacturer's servers, which, under Chinese law, the manufacturer is obliged to hand over to government bodies when required.
The discovery has also sparked major ethical debates among artificial intelligence researchers, who warn against state discrimination, profiling and punishment.
The system may not produce accurate results, as its performance would vary greatly depending on lighting, image quality and other factors. Moreover, human ethnic diversity does not divide neatly into simple categories.
The internal test report also said that the system could take real-time snapshots of pedestrians, analyse video files and replay the ten seconds of footage before and after the detection of any Uighur face.
China is not the only country where facial recognition is used. In the United States, police forces have used some of these technologies, including facial recognition, to investigate crimes.
After the nationwide summer protests against police misuse of the system, the technology was banned in several jurisdictions. In Uganda, police and government officials used Huawei cameras to identify demonstrators and political opponents.