Coronavirus tracing tech policy ‘more significant’ than the war on encryption

Tech-savvy individuals and firms have been eager to apply their skills to the coronavirus pandemic, as they should be. Some of them are working with governments that have flexed their “special powers” and public health muscles, as governments should do.

Much of this tech effort, from all sides, has been put into contact tracing, which aims to find out who might have been exposed to the virus from an infectious person.

Contact tracing is already a routine process in most developed nations for battling things like meningococcal disease, tuberculosis, and sexually transmitted infections (STIs), including HIV.

Normally, this “painstaking and quick detective work” is labour-intensive and involves lots of phone calls and text messages. The new technologies being developed aim to improve on that.

Australia’s plan to adopt TraceTogether, the COVID-19 contact tracing app from Singapore, is one obvious example.

The remarkable partnership between Apple and Google to roll out APIs to enable contact tracing apps is another.

But how many of these players are thinking about the long-term implications?

TraceTogether’s creators seem to have made a solid effort to protect users’ privacy from each other. The so-called “Central Authority” server generates temporary IDs which are periodically refreshed, for example.

The data log contains only the relative distance between users, as determined by Bluetooth signal strength, not the exact location where they came into close contact.
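In rough outline, the logging mechanism looks something like the sketch below. This is a simplified illustration of the published design rather than TraceTogether’s actual code; the class names, rotation interval, and record fields are assumptions made for the example.

```python
import secrets
import time

TEMP_ID_LIFETIME = 15 * 60  # hypothetical rotation interval, in seconds


class CentralAuthority:
    """Illustrative stand-in for the health authority's server:
    it alone can map a temporary ID back to a registered user."""

    def __init__(self):
        self._registry = {}  # temp_id -> (phone_number, issued_at)

    def issue_temp_id(self, phone_number):
        temp_id = secrets.token_hex(16)
        self._registry[temp_id] = (phone_number, time.time())
        return temp_id


class Phone:
    """A handset broadcasts only its current temporary ID and logs the
    IDs it hears, plus signal strength -- never GPS coordinates."""

    def __init__(self, phone_number, authority):
        self.authority = authority
        self.phone_number = phone_number
        self.temp_id = authority.issue_temp_id(phone_number)
        self.issued_at = time.time()
        self.encounter_log = []  # held locally, pruned after 21 days

    def current_temp_id(self):
        # Rotate the temporary ID once it expires, so other users
        # can't track a device by a fixed identifier.
        if time.time() - self.issued_at > TEMP_ID_LIFETIME:
            self.temp_id = self.authority.issue_temp_id(self.phone_number)
            self.issued_at = time.time()
        return self.temp_id

    def record_encounter(self, other_temp_id, rssi_dbm):
        # Signal strength stands in for distance; no location is stored.
        self.encounter_log.append(
            {"temp_id": other_temp_id, "rssi": rssi_dbm, "ts": time.time()}
        )
```

The privacy catch, as the researchers note, is that the registry mapping temporary IDs back to phone numbers lives entirely with the Central Authority.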

But a detailed analysis by researchers from the University of Melbourne and Macquarie University highlights a range of privacy flaws.

One key problem is that users must trust the Central Authority — in Singapore, that’s a Ministry of Health server — to do the right thing.

“Even though the data logs are only sent to the Central Authority following user’s consent, there is no check to ensure that the request from Central Authority is genuine or not, i.e., whether that user was in proximity of an infected user,” the researchers wrote.

“Thus, a curious Central Authority might be able to obtain and decrypt data logs from a large number of users yielding to [a] potential mass-surveillance threat.”

While the data logs held locally on users’ devices are deleted after 21 days, there’s no guarantee that the data logs decrypted at the authority server would also be deleted.

The ever-present risk of re-identification

As well as tweaks to provide more protection from the Central Authority, and less centralisation, the researchers also recommend that any future use of anonymised data logs “must be restricted”.

“An important aspect of data gathered by the server is future use by epidemiologists and policymakers,” they wrote.

“Although the information seems innocuous, it can be very sensitive and reveal a lot about the users.”

The privacy of medical information is particularly important.

As the Australasian Contact Tracing Guidelines remind us, any disclosure that individuals have been tested for, or are living with, conditions such as HIV/AIDS or other STIs can invite social stigma and discrimination.

“People may be reluctant to seek medical attention if they fear their information could be disclosed to others. This ‘chilling effect’ could have implications for the future prevention, treatment and study of medical conditions.”

These risks are also present with COVID-19. Australia is already seeing racist vandalism and physical and verbal abuse. If specific individuals were ever identified, their situation would only get worse.

For this reason, the researchers say that the data shouldn’t be made public, even if anonymised.

“A large percentage of the people might share their data. Even the contact graph, without locations, timestamps, phone numbers or explicit identities, can be linked to other data sources enabling user re-identification.”
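To see why structure alone can betray identity, consider the toy example below. The graphs, names, and matching heuristic are invented purely for illustration; they are not drawn from any real dataset or from the researchers’ analysis.

```python
# A "de-anonymisation by linkage" toy: match nodes of an anonymised
# contact graph to a known social graph using only graph structure.

# Hypothetical anonymised contact graph released for research:
# node labels are random codes, only the edge structure survives.
anon_edges = [("x1", "x2"), ("x1", "x3"), ("x2", "x3"), ("x3", "x4")]

# Auxiliary data an attacker already holds, e.g. a partial social
# graph scraped from public sources, with real names attached.
known_edges = [("Alice", "Bob"), ("Alice", "Carol"),
               ("Bob", "Carol"), ("Carol", "Dave")]


def degree_signature(edges):
    """Map each node to (its degree, sorted degrees of its neighbours)."""
    neighbours = {}
    for a, b in edges:
        neighbours.setdefault(a, set()).add(b)
        neighbours.setdefault(b, set()).add(a)
    return {
        node: (len(ns), tuple(sorted(len(neighbours[m]) for m in ns)))
        for node, ns in neighbours.items()
    }


anon_sig = degree_signature(anon_edges)
known_sig = degree_signature(known_edges)

# Any node whose structural signature is unique in both graphs can be
# matched, re-identifying the "anonymous" code with a real person.
for code, sig in anon_sig.items():
    matches = [name for name, s in known_sig.items() if s == sig]
    if len(matches) == 1:
        print(f"{code} is probably {matches[0]}")
```

Even this crude heuristic pins two of the four “anonymous” codes to named individuals; real linkage attacks use far richer auxiliary data.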

In fact, another University of Melbourne team found just such a vulnerability in a supposedly anonymised public dataset in 2016, and went on to re-identify seven prominent Australians in 2017.

The government didn’t really fix the problem, however. It just made data matching illegal.

Digital rights activists push back

Digital Rights Watch Australia (DRW) has called for more transparency about the planned use of TraceTogether, along with “unimpeachable guarantees” that the data won’t be used for anything else.

“They certainly need to do better than suggesting that privacy implications will be examined by the Attorney-General,” said DRW chair Lizzie O’Shea on Wednesday.

“Everything about this needs to be transparent. The code must be independently audited. There needs to be a clear benchmark for when data will no longer be collected and the app deactivated.”

O’Shea noted, as others have, that there’s a real risk of false positives and a need to preserve human rights even in the face of a pandemic.

“The existence of encryption-breaking laws like the government’s own Assistance and Access [Act] undermines our capacity to keep such systems secure,” she said.

“Such technological tools need a social licence to operate effectively, and the government has a long way to go before it comes close to earning it.”

In a global context, Dr TJ McIntyre, an associate professor in the Sutherland School of Law at University College Dublin, went further.

“COVID-19 tracing is the most significant technology policy development of this generation — even more so than the war against end to end cryptography — and we’re watching it happen at breakneck speed,” McIntyre said.

“The role of tech firms vs states will be critical.”

So where to from here?

Genevieve Bell, director of the 3A Institute at the Australian National University, wrote that the response to the coronavirus presents a chance to reinvent the way we collect and share personal data while protecting individual privacy.

“The speed of the virus and the response it demands shouldn’t seduce us into thinking we need to build solutions that last forever,” Bell wrote.

“There’s a strong argument that much of what we build for this pandemic should have a sunset clause — in particular when it comes to the private, intimate, and community data we might collect.”

Of course, once governments gain certain powers or access to certain technologies, very rarely do they hand them back with a friendly “Thanks, we don’t need that any more”.

In fact, the opposite happens. There is always scope creep.

What makes the current situation in Australia even more worrisome is that TraceTogether has been fast-tracked through the review process at a time when Parliament and its various oversight committees have been shut down.

Yes, we need to fight the coronavirus with extraordinary measures, but we also need to have our wits about us.
