The third-party doctrine is worse than that: the data you handed over is not only no longer yours, it isn't really theirs either, it's the government's. The company is forced to act as a government informant, with no warrant requirement. It can say "we will do our very best to keep your data confidential" and contractually bind itself to do so, but hilariously, in the Supreme Court's wise and knowledgeable legal view, this still creates no "expectation of privacy", regardless of whatever vaults, encryption, careful employee vetting, and armed guards stand between your data and unauthorized parties.
The implication being that the only recourse is to change the law.
Those precedents are also fairly insane and not even consistent with one another. For example, the government needs a warrant to read your mail in the possession of the Post Office -- not only a third party but actually part of the government -- but not the digital equivalent of this when you transfer some of your documents via Google or Microsoft?
This case is also not the traditional third-party doctrine case. Typically you would have e.g. your private project files on GitHub, which GitHub is retaining for reasons independent of any court order, and then the court orders them to be produced. In this case the judge is ordering them to retain third-party data they wouldn't have otherwise kept. It's not clear what the limiting principle there would be -- could a court order Microsoft to retain any of the data on everyone's PC that isn't in the cloud, because their system updater gives them arbitrary code execution on every Windows machine? Could it order your landlord to make copies of the files in your apartment without a warrant because they have a key to the door?
My understanding is it's closer to something like: the court cannot order a company to create new tools, but it can tell the company not to destroy data it already has. So MS merely having the ability to build a tool that extracts your data is not the same as MS already having that tool running, collecting and storing your data, and then being told simply not to destroy it. Similarly, VPNs that are not set up to create logs can't keep or hand over what they don't have.
Laws could be passed requiring every online company to collect and store all user data, but we're not there -- yet. Many companies already do it on their own, and each user then decides whether that's acceptable enough to keep using the service.
If the company had built its service not to have the data in the first place, this probably never would have found its way to a judge. The service would cost more, be slower, and probably be harder to iterate on, since in a fast-moving space it's easier to hack things together than to build privacy- and security-first solutions.
"The data they already have" means the data the user gave the company (no one is "giving" their files to their landlord), which the company now fully possesses and owns. At that point the users are no longer in possession or ownership of the data they gave away.
If you hand out photocopies of the files in your apartment, the files in your apartment are still yours, but the copies you gave away to a bunch of companies are not. Those now belong to the companies you gave them to, and they can do whatever they want with them. So if they keep them and a judge orders that the documents not be destroyed (under a preservation order), they would probably get into trouble for going against the order.
Which is what I was trying to bring attention to: the company has a choice in what data (if any) it decides to collect, possess, and own. If it had never collected or stored the data, no one's privacy would be threatened.
https://en.wikipedia.org/wiki/Third-party_doctrine#:~:text=w...
If OpenAI doesn't succeed at oral argument, then in theory they could try for an appeal either under the collateral order doctrine or seeking a writ of mandamus, but apparently these rarely succeed, especially in discovery disputes.
To prevent that you need Congress to tell them no, but that creates a sort of priority inversion: The machinery designed to stop the government from doing something bad unless there is consensus is then enabling government overreach unless there is consensus to stop it. It's kind of a design flaw. You want checks and balances to stop the government from doing bad things, not enable them.
> once you voluntarily give your data to a third party-- e.g. when you sent it to OpenAI-- it's not yours anymore and you have no reasonable expectation of privacy about it.
Sorry for the layperson question, but does this apply to my company's storage of confidential info on, say, Google Drive, even with an enterprise agreement?

Furthermore, if the third-party doctrine is upheld in its most naïve form, it would breach the EU-US Data Privacy Framework. The US must ensure privacy protections equivalent to those under the GDPR for the agreement to be valid, and the framework also explicitly forbids transferring information to third parties without informing the people whose information is transferred.
The magistrate is suggesting that there is no reasonable expectation of privacy in chats OpenAI agreed, at users' request, to delete. This is bizarre, because there is no way for OpenAI to use data that has been deleted; it's gone. Fixing this doesn't require abrogating US law, it requires a sensible judge to sit up and recognize that they just infringed on the privacy expectations of millions of people.
Like it or not, the judge's ruling sits comfortably within the framework of US law as it exists today: since there is no reasonable expectation of privacy in chat logs sent to OpenAI, there is nothing to weigh against the competing interest of the active NYT case.