Mullvad and E2EE messengers do not need to process the contents of a message on their servers. All they do is pass it along to another computer; it could be scrambled binary for all they care. But any AI company _has_ to read the content of the message, by the very definition of its service.
Read the marketing carefully and you will notice that there is no word about encrypted processing, just encrypted storage - and of course that's a solved problem, because it was solved decades ago.
The agent needs the data decrypted, at least momentarily; I know of no model that can process encrypted data. So as long as the model runs on a server, whoever manages that server has access to your messages while they are being processed.
EDIT: I even found an article where they acknowledge this [0]. Even though there are models/techniques that can produce output from encrypted messages using 'Homomorphic Encryption' [1], it is not practical: it would take days to produce an answer and would consume huge amounts of processing power.
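For intuition on what homomorphic encryption buys you: the server can compute on ciphertexts without ever seeing the plaintexts. Here is a minimal sketch of the additively homomorphic Paillier scheme with toy key sizes (stdlib only, absolutely not production crypto - real deployments use ~2048-bit primes):

```python
import math
import secrets

# Toy Paillier keypair (demo primes; far too small for real use).
p, q = 10007, 10009
n = p * q
n2 = n * n
lam = math.lcm(p - 1, q - 1)  # Carmichael's function of n
mu = pow(lam, -1, n)          # valid because we fix g = n + 1

def encrypt(m: int) -> int:
    """c = (1 + n)^m * r^n mod n^2, with a random blinding factor r."""
    r = secrets.randbelow(n - 1) + 1
    while math.gcd(r, n) != 1:
        r = secrets.randbelow(n - 1) + 1
    return (pow(1 + n, m, n2) * pow(r, n, n2)) % n2

def decrypt(c: int) -> int:
    """m = L(c^lam mod n^2) * mu mod n, where L(x) = (x - 1) // n."""
    return ((pow(c, lam, n2) - 1) // n) * mu % n

# The homomorphic property: multiplying ciphertexts adds the plaintexts,
# so a server holding only ciphertexts can still compute a sum.
a, b = encrypt(3), encrypt(4)
assert decrypt((a * b) % n2) == 7
```

This is why the idea is so attractive - and also hints at why it is slow: every operation on hidden data costs big-integer modular exponentiations instead of plain arithmetic.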
Maybe, but leaving aside that they are two different kinds of products, how can you trust them to really do so? And anyway, in the case of ChatGPT, where should I store my client-side private key, given that I use those bots only in my web browser? Maybe in my password manager, copy-pasting it every time I start a new conversation?
My take is that if they had gone this way we would not be talking about them now; we would be talking about one of their competitors that didn't put hurdles between their product and their customers.
In other words, survivor bias.
I built E2E encrypted LLMs using secure enclaves, so I know a bit about this space.
The tech works, for small LLMs - the sort of thing you can run on your mobile already. It isn't yet (?) there for LLMs the size of ChatGPT.
Models should run in ephemeral containers where data is only processed in RAM. For each active conversation a unique, temporary key pair is generated. Saved chats are encrypted client side and stored encrypted server side. To resume a conversation[0]: decrypt client side, establish a connection to a container, generate a new temporary key pair, and so on. There are more details and nuances, but this is very doable.
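A rough sketch of that client-side flow, using only stdlib primitives. The SHA-256 counter-mode keystream is a toy stand-in for a real authenticated cipher (AES-GCM, XChaCha20-Poly1305); the point is only where the key lives:

```python
import hashlib
import secrets

def keystream_xor(key: bytes, data: bytes) -> bytes:
    """Toy stream cipher: XOR data with a SHA-256 counter-mode keystream.
    Stand-in for a real AEAD cipher; the same call encrypts and decrypts."""
    out = bytearray()
    for block in range(0, len(data), 32):
        pad = hashlib.sha256(key + block.to_bytes(8, "big")).digest()
        chunk = data[block:block + 32]
        out.extend(b ^ p for b, p in zip(chunk, pad))
    return bytes(out)

# 1. The client generates a key that never leaves the device.
client_key = secrets.token_bytes(32)

# 2. Saved chats are encrypted client side; the server stores only ciphertext.
chat = b'[{"role": "user", "content": "hello"}]'
stored_on_server = keystream_xor(client_key, chat)
assert stored_on_server != chat

# 3. To resume: decrypt client side, then talk to a fresh ephemeral
#    container over a newly generated session key pair (not shown here).
resumed = keystream_xor(client_key, stored_on_server)
assert resumed == chat
```

The server-side operator only ever sees ciphertext at rest; the window of exposure is the RAM of the ephemeral container during processing.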
How Mullvad handles your data, for some inspiration: https://mullvad.net/en/help/no-logging-data-policy
I'm not sure why this is a problem. There's no requirement that data at rest needs to be unencrypted, nor is there a requirement that those storing the data hold the keys to decrypt it. Encrypted storage is a really common thing... For this we can use the above scenario, or a multi-key setting if you want to ping multiple devices, or the data can be temporarily decrypted during processing. There is still no need to store the data to disk unencrypted, or encrypted with keys OAI owns.

Of course, I also don't see OAI pushing the state of Homomorphic Encryption forward either... But there's definitely a lot of research, and more than acceptable solutions, that allow data to be processed server side while staying encrypted for as long as possible and making access to that data incredibly difficult.
Again, dive deep into how Mullvad does it. They can't make all their data encrypted, but they make it as close to impossible to get as they can, including for themselves. There doesn't need to be a perfect solution, but there's no real reason these companies couldn't restrict their own access to that data. There are only two reasons they are not doing so: either 1) they just don't care enough about your privacy, or 2) they want it for themselves. Considering how OpenAI pushes the "Scale is All You Need" narrative, and "scale" includes "data", I'm far more inclined to believe the reason is option 2.
[0] Remember, this isn't so much a conversation in the conventional sense. The LLMs don't "remember": you send them the entire chat history in each request. In this sense they are Markovian. It's not like they're tuning a model just for you. And even if they were, well, we can store weights encrypted too. Doesn't matter if it's a whole model, a LoRA, embeddings, or whatever - all of that can be encrypted at rest with keys OAI does not have access to.
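The "Markovian" point is easy to see in how chat APIs are actually called: every turn resends the full history, so nothing needs to persist server side between requests. A schematic example, where `call_model` is a hypothetical stub standing in for any stateless completion endpoint:

```python
def call_model(messages: list[dict]) -> str:
    """Stub for a stateless completion endpoint: it sees only what is
    in `messages`, nothing carried over from previous calls."""
    return f"(reply to {len(messages)} messages)"

history: list[dict] = []

def ask(user_text: str) -> str:
    # The entire conversation so far goes out with every request.
    history.append({"role": "user", "content": user_text})
    reply = call_model(history)
    history.append({"role": "assistant", "content": reply})
    return reply

ask("hello")
ask("and another thing")
# After two turns the client-held payload already has four messages.
assert len(history) == 4
```

Since the client holds the only durable copy of `history`, it is exactly the thing that can be stored encrypted with client-side keys between sessions.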