As companies race to add AI, terms of service changes are going to freak a lot of people out
Analysis WeTransfer this week denied claims it uses files uploaded to its ubiquitous cloud storage service to train AI, and rolled back changes it had introduced to its Terms of Service after they deeply upset users. The topic? Granting licensing permissions for an as-yet-unreleased LLM product.
Agentic AI, GenAI, AI service bots, AI assistants to legal clerks, and more are washing over the tech space like a giant wave as the industry paddles for its life, hoping to surf a neural-network breaker. WeTransfer is not the only tech company refreshing its legal fine print – any new product that needs permissions-based data access – not just for AI – is going to require a change to its terms of service.
In the case of WeTransfer, the passage that aroused ire was:
In a statement released during the backlash, WeTransfer insisted it had zero intention of abusing anyone’s intellectual property, saying it had made the change to cover an upcoming moderation service. It said it was merely considering the “possibility of using AI to improve content moderation and further enhance our measures to prevent the distribution of illegal or harmful content on the WeTransfer platform.”
The feature hasn’t been built or used “in practice,” but it was “under consideration,” said the file-transfer outfit. “To avoid confusion, we’ve removed this reference.”
But users were not happy about the wording, which has since been removed, and took to social media and The Register to say so, with one telling us: “Given that one of the common use cases is to securely transfer sensitive content between users, this is a gross violation of privacy and they should be called out until they roll back this change or lose all their customers.” Coming in for special ire was the phrase: “You will not be entitled to compensation for any use of Content by us under these Terms.”
WeTransfer, for its part, has since conceded it didn’t need the ToS tweak at all. The cloud storage company told us this morning: “In retrospect, we would have excluded the mention of machine learning entirely as we don’t use machine learning or any form of AI to process content shared via WeTransfer.”
It added: “We regret that our terms caused unnecessary confusion. We recognize AI is a sensitive and important topic for the creative community that can elicit strong reactions.”
You can read the old and new clauses in full, along with a fuller explanation from the company, here.
Speaking to us about ToS tweaks more generally, specialist senior solicitor Neil Brown, who runs tech-savvy English law firm decoded.legal, told us: “In terms of the company, if it wants to do something which requires more permissions from the user (e.g. a copyright licence) than its terms currently provide, then the company may well reasonably conclude that the best thing to do, for its own protection, is to ensure that the rights it is granted under its contract with its users covers what it needs to do.”
When we asked whether cloud services generally need permissions when it comes to copyright just to store and process files, Brown told The Reg: “I can’t speak for all jurisdictions / regimes around the world, and the position may vary, but at least in the UK, we have the notion of an ‘implied licence.’
“So if a company provided – say – a hosted storage offering, and did not seek an explicit grant of a licence from the user for any copying inherent in providing that service, the company is likely to claim that, nevertheless, it had an implied licence from the user for this purpose.”
He added: “The challenge with implied licences comes down to mismatched expectations: is what the company wants to do with the copyright works what the user intends the company to do? If not, there is the possibility of the user claiming that the company has acted unlawfully – that it has infringed the user’s copyright.
“So, in practice, most companies will try to include some sort of language in their terms which grants them all rights necessary to provide the services, or something like that.”
He added: “Some organizations will try to be more specific, but others will see that as a potential barrier to changing the services, if they would then also need to change their terms of service.”
But techies tend to watch these things closely, so being told explicitly what is happening matters: a ToS change made without a full explanation can create more trouble for a company than the change itself.
Back in 2023, WeTransfer’s file-sharing rival, Dropbox, also had to fend off claims it was using uploaded files to train LLMs when a customer – Werner Vogels, who also happens to be the CTO of Amazon – noticed a toggle switch that let users opt in to “use AI from third-party partners” to “work faster in Dropbox.”
After the backlash, Dropbox CEO Drew Houston set Vogels straight, responding: “Third-party AI services are only used when customers actively engage with Dropbox AI features which themselves are clearly labeled.” Nonetheless, as The Register said at the time, the move fed into the so-called “AI trust crisis,” with developer Simon Willison musing that many people no longer trust what big tech or AI entities say.
As Willison says, trust matters. “People both overestimate and underestimate what companies are doing, and what’s possible. This isn’t helped by the fact that AI technology means the scope of what’s possible is changing at a rate that’s hard to appreciate even if you’re deeply aware of the space.”
Others took issue with more than just the legal aspects, with open standards boffin Terence Eden suggesting that maybe netizens should stop flinging files at each other altogether, in a post titled “We’ve got to stop sending files to each other.”
®