China lays out ground rules to stem deepfake abuse

China has laid out ground rules to prevent “deep synthesis” technology, including deepfakes and virtual reality, from being abused. Anyone using these services must label the content accordingly and refrain from tapping the technology for activities that breach local regulations.

The Cyberspace Administration of China, the Ministry of Industry and Information Technology, and the Ministry of Public Security released a joint statement mandating that the use of deep synthesis technology and services be clearly indicated, so that synthesised content is not mistaken for real information.

Effective from January 10 next year, the new rules aim to protect national security and the country’s core social values, as well as safeguard the rights and interests of citizens and organisations, the government agencies said.

They noted that while deep synthesis technology had improved user experience, it also had been used to impersonate identities and disseminate false and harmful information that tarnished victims’ reputations, endangering national security and social stability.

They added that regulations were necessary to mitigate such risks and drive the “healthy” development of new technology. The ground rules also would standardise the development of deep synthesis services and ensure these were in line with the country’s other related regulations, including data security and personal information protection laws. 

The new rules will apply to technologies that use deep learning, virtual reality, and other synthetic algorithms to create text, images, video, audio, and virtual scenes, including text-to-speech, voice editing, gesture manipulation, digital simulation, and 3D reconstruction.

Apart from prohibiting the use of deep synthesis services to produce and disseminate information banned by local laws, the new regulations also require the implementation of a real-identity authentication system and other management systems covering user registration, algorithm mechanism review, data security, emergency response, and ethics review. In addition, technical safety measures must be established.

These management rules and service agreements must be disclosed. Users also will have to put in place mechanisms to address rumours in a timely manner, should deep synthesis services be used to publish or disseminate false information. The relevant government agencies will need to be notified, too.
