Can you code a way to foil online terrorist vids? The Home Office might just have £600K for you

UK prime minister (at time of writing) Boris Johnson today announced to the UN General Assembly a plan to block the sharing of violent videos on social media after terrorist attacks.

The announcement specifically references the attacks in March on two mosques in Christchurch, New Zealand, which killed 51 people. The attacker live-streamed his actions, leading to hundreds of versions rapidly spreading across online platforms. Facebook alone had to remove more than 1.5 million uploads in a complex and time-consuming search-and-delete operation.

The Home Office is to make £600,000 available to develop “industry-wide technology” for automatically identifying online videos containing not just the original violent content but also copies that have been edited to evade filters.
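For a sense of what such “industry-wide technology” might involve, one common approach to catching re-encoded or lightly edited copies is perceptual hashing, in which similar frames produce similar fingerprints and a match is judged by how many bits differ rather than by an exact checksum. The minimal Python sketch below illustrates the idea only; the frame grids, thresholds and function names are hypothetical, and nothing here is the algorithm the Home Office is commissioning.

```python
# A hypothetical illustration of perceptual-hash matching, the general kind of
# technique used to spot re-uploads of known videos. This is not the Home
# Office's algorithm; frames, thresholds and names are placeholders.

def dhash(frame):
    """Difference hash of one greyscale frame, given as an 8x9 grid of 0-255 values.

    Each bit records whether a pixel is brighter than its right-hand neighbour,
    so small changes from compression or light filtering barely alter the hash.
    """
    bits = 0
    for row in frame:
        for left, right in zip(row, row[1:]):
            bits = (bits << 1) | (1 if left > right else 0)
    return bits

def hamming(a, b):
    """Number of bits that differ between two hashes."""
    return bin(a ^ b).count("1")

def matches_known_video(frame_hashes, blocklist, max_distance=10):
    """True if at least half the sampled frames sit close to a hash on the blocklist."""
    hits = sum(
        1
        for h in frame_hashes
        if any(hamming(h, known) <= max_distance for known in blocklist)
    )
    return hits >= max(1, len(frame_hashes) // 2)

# Toy usage: a lightly "edited" copy still matches, unrelated content does not.
original  = [[10, 20, 30, 40, 50, 60, 70, 80, 90]] * 8
edited    = [[12, 21, 29, 41, 52, 61, 69, 82, 91]] * 8
unrelated = [[90, 10, 80, 20, 70, 30, 60, 40, 50]] * 8

blocklist = {dhash(original)}
print(matches_known_video([dhash(edited)], blocklist))     # True
print(matches_known_video([dhash(unrelated)], blocklist))  # False
```

In practice, systems built on this idea sample many frames per video and share the resulting hashes between platforms, which is closer to the kind of free, pluggable tool the announcement describes.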

“UK data-science experts, supported by the Home Office, will use the new funding to create an algorithm which any technology company in the world can use, free of charge, to improve the way that they detect violent and harmful videos and prevent them being shared by their users,” read the announcement.

It turns out these “data-science experts” have yet to be appointed. The Home Office confirmed to The Reg that the £0.6m fund is being dangled as a prize that will be opened to competitive bids from UK tech companies later this year.

“We’re hoping to deliver the funding by 15 March 2020,” a spokesperson told us. “That’ll be the anniversary of the Christchurch attacks.”

In an official statement, Home Secretary Priti Patel said: “The UK has a track record of showing that state-of-the-art technology can be developed, in partnership with industry, at relatively low cost and this is just the latest example of our commitment to working with industry to tackle our shared challenges and respond to the ever evolving threats which we face.”

Those considering a bid should be aware that it is the government’s intention that whatever method is devised to block violent viral videos can be plugged into other tech companies’ software as part of a concerted worldwide effort to stop the problem. In Paris back in May, world leaders signed up to the Christchurch Call to Action to tackle terrorist use of the internet, and there is a feeling that their shared commitment should produce sharable results.

The Home Office admitted that the research might also be used to help spot other types of harmful content, such as child sexual abuse imagery.

But will chucking £600,000 at the problem be enough? Bigger sums have been spent on achieving less and, despite Patel’s bluster, the UK government’s track record on outsourcing successful IT projects at low cost has been less than glowing.

Commenting on the announcement, Paul Bischoff, privacy advocate at Comparitech.com, said: “The sceptic in me thinks this is just a gesture of goodwill and not a serious attempt at censoring terrorist videos.

“Even if you don’t think Facebook does a good enough job, it still has a huge advantage in that it’s been working on this problem for some time, has massive amounts of data to test with, and has a business incentive to get it right.

“I think we should also question the efficacy of censorship. The fear is that these videos will influence other potential terrorists into taking action. But I think it’s equally valid to say that such videos will incite anti-terrorism sentiment as well.

“Government censorship is a slippery slope that can lead to totalitarianism, so we must tread carefully.” ®
