If an organization runs a survey in 2024 on whether it should get into AI, then they’ve already bodged an LLM into the system and they’re seeing if they can get away with it. Proton Mail is a priva…
we appear to be the first to write up the outrage coherently too. much thanks to the illustrious @self
it can run locally, but Proton discourages that in its marketing, the local model has very high system requirements, and it requires a Chromium-based browser (a non-starter for a solid chunk of Proton’s userbase). otherwise, you get the cloud version of the feature, which works exactly as the quote describes, however much Proton pretends it doesn’t. it’s also genuinely extraordinary that they pushed this feature at all without publishing anything about its threat model.
it’s unclear what happens if the feature’s enabled and set to local but you switch to a computer that can’t run the LLM. it’s also just fucked that there are two outwardly identical versions of the same feature, but one of them exfiltrates your data.
Besides, I just don’t want AI in general, is that too much to ask?
you’re not alone. the other insulting part of this is that the vast majority of Proton’s userbase indicated in responses to Proton’s 2024 survey that they didn’t want this feature, and the survey was effectively constructed so it was impossible to say no to the LLM feature in the first place, since the feature portion was stack-ranked. the blog post introducing Scribe even lies about the results of the survey: an LLM wasn’t even close to being the most requested feature.