Morrison sells Australia’s terrorism video streaming plan to the G20

Led by Australia, the G20 nations have urged online platforms to “meet our citizens’ expectations” by preventing terrorist content, and content stemming from violent extremism conducive to terrorism (VECT), from being streamed, uploaded, or re-uploaded.

“Platforms have an important responsibility to protect their users,” read the Leaders’ Statement [PDF] issued in Osaka on Saturday.

“The complexity of the challenge — and increasing sophistication of the criminals who would misuse the internet — does not lessen the importance of platforms mitigating the proliferation of terrorist and VECT content, which harms society, via their platforms,” it said.

“We issue this statement to raise the bar of expectation for online platforms to do their part.”

The G20 leaders’ statement comes in the wake of the Christchurch terrorist attack of 15 March, which was live-streamed on Facebook and replicated there and elsewhere.

“The internet must not be a safe haven for terrorists to recruit, incite or prepare terrorist acts. To this end, we urge online platforms to adhere to the core principle, as affirmed in Hamburg, that the rule of law applies online as it does offline,” they said.

“This must be achieved in a way that is consistent with national and international law, including human rights and fundamental freedoms such as freedom of expression and access to information — we hold these in high regard.”

Online platforms have been urged to “step up the ambition and pace of their efforts”.

“We strongly encourage a concerted effort to set out, implement and enforce terms of service to detect and prevent terrorist and VECT content from appearing on their platforms,” the leaders said.

Australian Prime Minister Scott Morrison reiterated his framing of the statement as being compatible with human rights, having said previously that it was not about constraining free speech.

“I think the way we have pursued this was to keep it very focused on what it was trying to achieve. We didn’t broaden it out. This was about simply trying to ensure that we all were agreed that the internet should not become a weapon of terrorists,” Morrison told journalists on Saturday.

“This sends a clear message, and the impetus of this is to say to the companies, ‘You have the technology, you have the innovation, you have the resources, and now you have the clear communicated will of the world’s leaders that you need to apply that to get this right.'”

Australian government taskforce recommends action plan

Australia’s legislative and political response to the Christchurch massacre has been swift and draconian.

Morrison held a summit on 26 March, just 11 days after the event, with representatives from Facebook, YouTube, Amazon, Microsoft, and Twitter, along with Telstra, Vodafone, TPG, and Optus, as well as senior ministers and the heads of relevant government agencies.

Nine days later, the Criminal Code Amendment (Sharing of Abhorrent Violent Material) Act 2019 [PDF], often described as the “social media” laws, was rammed through parliament with almost zero debate.

The new law triggered consternation in the tech industry, with particular concern over the criminal penalties for platforms that didn’t take down the abhorrent material “expeditiously”.

The summit also led to the creation of the Australian Taskforce to Combat Terrorist and Extreme Violent Material Online. The taskforce released its 13-page report on Sunday, setting out recommendations on how to prevent such content from appearing online.

The report’s action plan and recommendations fall into five streams: prevention; detection and removal; transparency; deterrence; and capacity building.

Online platforms are expected to develop further technical solutions, and work with other members of the Global Internet Forum to Counter Terrorism (GIFCT) to strengthen their hash-sharing and URL-sharing databases. 
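For context, the GIFCT hash-sharing approach circulates digital fingerprints of material already confirmed as terrorist or extreme violent content, so member platforms can match new uploads against them. The sketch below is a minimal illustration under simplified assumptions: the SharedHashDatabase class is invented for this example, and it uses exact-match SHA-256 fingerprints, whereas production systems rely on perceptual hashing so that re-encoded or cropped copies of the same video still match.

```python
import hashlib

class SharedHashDatabase:
    """Toy stand-in for a GIFCT-style shared hash database (illustrative only)."""

    def __init__(self) -> None:
        self._hashes: set[str] = set()

    @staticmethod
    def fingerprint(data: bytes) -> str:
        # Exact-match fingerprint; real systems use perceptual hashing
        # (e.g. PhotoDNA, PDQ) so altered copies still match.
        return hashlib.sha256(data).hexdigest()

    def add_known_material(self, data: bytes) -> None:
        # A member platform contributes the fingerprint of confirmed material.
        self._hashes.add(self.fingerprint(data))

    def should_block(self, data: bytes) -> bool:
        # Check an incoming upload or re-upload against the shared list.
        return self.fingerprint(data) in self._hashes
```

In this toy model, “strengthening” the database simply means more platforms contributing fingerprints sooner, which is the practical point of the recommendation.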

Platforms should “review the operation of algorithms and other processes that may drive users towards (or amplify) terrorist and extreme violent material to better understand possible intervention points, and to implement changes where this occurs”.

“This may include using algorithms and other processes to redirect users from such content, or the promotion of credible, positive alternatives or counter-narratives,” the report said.

Platforms should also have “clear, efficient appeals mechanisms” that give users the ability to challenge moderation decisions.

Moderation processes should “implement visible and intuitive user reporting mechanisms and minimise friction for users in reporting problematic content”, and “assign the highest level of priority (similar to that for other abhorrent content such as child abuse) to the triaging and moderation of terrorist and extreme violent material”.
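To illustrate what assigning the “highest level of priority” to such reports might look like in practice, here is a minimal sketch of a moderation triage queue. The categories and priority values are invented for the example; the report only says that terrorist and extreme violent material should sit in the same top tier as other abhorrent content such as child abuse.

```python
import heapq
from dataclasses import dataclass, field
from typing import List

# Illustrative priority tiers only; lower number = reviewed sooner.
PRIORITY = {
    "terrorist_or_extreme_violent": 0,
    "child_abuse": 0,
    "graphic_violence": 1,
    "harassment": 2,
    "spam": 3,
}

@dataclass(order=True)
class Report:
    priority: int
    report_id: str = field(compare=False)
    category: str = field(compare=False)

class TriageQueue:
    """Moderation queue that always surfaces the highest-priority report first."""

    def __init__(self) -> None:
        self._heap: List[Report] = []

    def submit(self, report_id: str, category: str) -> None:
        heapq.heappush(self._heap, Report(PRIORITY[category], report_id, category))

    def next_for_review(self) -> Report:
        return heapq.heappop(self._heap)
```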

The recommendations for live-streaming controls include things such as strengthening account validation processes, and limiting the ability of new users to live-stream until they have established a pattern of behaviour.

The latter might include “cooling off periods” before a new user can live-stream, limiting audience size, implementing streamer ratings or scores, or monitoring account activity.
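As a rough sketch of how that kind of gating could work, the function below checks a hypothetical account’s age, upload history, and community-standards strikes before allowing it to live-stream. All thresholds are invented for illustration; the report does not prescribe specific values, and this is not any platform’s actual policy.

```python
from datetime import datetime, timedelta

# Illustrative thresholds only.
COOLING_OFF_PERIOD = timedelta(days=30)   # minimum account age before streaming
MIN_PRIOR_UPLOADS = 5                     # a minimal "pattern of behaviour"
NEW_STREAMER_AUDIENCE_CAP = 50            # limited audience for newer streamers

def live_stream_decision(created_at: datetime, prior_uploads: int,
                         strikes: int) -> dict:
    """Return whether an account may live-stream, and under what limits."""
    if strikes > 0:
        return {"allowed": False, "reason": "community-standards strike"}
    if datetime.utcnow() - created_at < COOLING_OFF_PERIOD:
        return {"allowed": False, "reason": "cooling-off period for new accounts"}
    if prior_uploads < MIN_PRIOR_UPLOADS:
        return {"allowed": True, "audience_cap": NEW_STREAMER_AUDIENCE_CAP}
    return {"allowed": True, "audience_cap": None}
```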

The taskforce also recommends that new laws be drafted to establish a “content blocking framework” for terrorist and extreme violent material online during crisis events.

The eSafety Commissioner should, in consultation with Communications Alliance, develop a “protocol to govern the interim use” of subsection 581(2A) of the Telecommunications Act 1997 during an online crisis event.

The report recommends running a “testing event” in 2019-20 to gauge the capability of the industry and government to respond to a simulated scenario. The test would be managed by the Australia-New Zealand Counter-Terrorism Committee.

Further recommendations cover the creation of a new “online crisis response protocol”, user account management, industry-government collaboration, reform of the GIFCT’s governance and structural arrangements, the monitoring and review of terrorist and extremist organisations, and reporting.

Taskforce retains narrow definition of banned content

The taskforce’s definition of terrorist and extreme violent material is closely aligned with, though narrower than, that in the Sharing of Abhorrent Violent Material Act.

It’s defined as “audio, visual or audio-visual material” that:

  • depicts an actual terrorist act targeting civilians (as opposed to animated or fictionalised);
  • depicts actual (as opposed to animated or fictionalised) violent crime; or
  • promotes, advocates, encourages or instructs a terrorist, terrorist group or terrorist act, or a person to commit actual (as opposed to animated or fictionalised) violent crime.

This definition is a subset of “abhorrent violent material”, which the Act defines as material depicting any of the following:

  • a terrorist act (involving serious physical harm or death, and otherwise within the meaning of section 100.1 of the Criminal Code Act 1995);
  • the murder of another person;
  • the attempted murder of another person;
  • the torture of another person;
  • the rape of another person; or
  • kidnapping involving violence.

“Abhorrent violent material” is restricted to material recorded or streamed by the perpetrator or their accomplice, not the media or bystanders attempting to document the violence. 
