They’d break SO MANY international and data security laws if they tried breaking into people’s OneDrive, it’d be hilarious to see the number of lawsuits they’d lose by default.
they’re probably already doing that to a smaller degree, and loosely protecting themselves with an obscure clause in their TOS. besides, you only lose lawsuits if you get caught - and churning things through AI is a great way to erase any fingerprints that identify stolen data
As soon as you find proof, that’s literally free money for the taking in any court.
That’s… not how any of this works…
an obscure clause in a TOS won’t be the small print of an evil villain speech exposing their plot in clear wording. what it would be is something worded vaguely enough to make it seem like the end user technically agreed to what was being done. it could also be an “and” where you expected “or”, or an omission of a specific thing… my point being - it’s always going to be a technicality that, in case of a lawsuit, would be a valid defence in the eyes of the law
it very much is how it works though? show me a lawsuit someone lost before they got caught committing a crime. and how would you even go about proving that your unpublished documents were used to train an AI? even an entire life’s work of one person is just a speck in the training data; it’s impossible to definitively prove your work was stolen and used for training. besides, there will always be plausible deniability that the AI just made shit up that happened to look kinda like what you once wrote
That means nothing. Illegal terms can’t be enforced in contracts or terms of service.
No. Written law always takes precedence. If they spied on your data stored in OneDrive, they’d lose by default the moment the case hit the courthouse.
As for your second paragraph: yeah, I agree. If they did that, the damage would already be done. But it would kill the business once found out. The benefit is not worth the risk.
For example: you’re saying that they would use it to train AI, right?
They don’t train AI. They get a trained model from OpenAI.