• Red_October@lemmy.world · 1 day ago · +10/-1

    Okay but are any AI chatbots really open source? Isn’t half the headache with LLMs the fact that there comes a point where it’s basically impossible for even the authors to decode the tangled madness of their machine learning?

    • lefixxx@lemmy.world · 1 day ago (edited) · +9

      Yeah but you don’t open source the LLM, you open source the training code and the weights and the specs/architecture

      • nymnympseudonym@lemmy.world · 1 day ago · +1/-2

        What do you think an LLM is? Once you’ve opened the weights, IMO it’s pretty open. Once they open the training data, that’s pretty damn open. What do you want, a Gitian-style reproducible build?

  • archchan@lemmy.ml · 2 days ago (edited) · +18/-1

    There’s some good discussion about the security in the comments, so I’m just going to say that Lumo’s Android app required the Play Store and GPlay Services. I uninstalled.

    It’s also quite censored. I gave Proton’s cute chatbot a chance, but I’m not impressed.

  • brucethemoose@lemmy.world · 2 days ago (edited) · +54/-1

    First of all…

    Why does an email service need a chatbot, even for business? Is it an enhanced search over your emails or something? Like, what does it do that any old chatbot wouldn’t?

    EDIT: Apparently nothing. It’s just a generic Open Web UI frontend with Proton branding, a no-logs (but not E2E) promise, and kinda old 12B-32B class models, possibly finetuned on Proton documentation (or maybe just a branded system prompt). But they don’t use any kind of RAG as far as I can tell.

    There are about a bajillion of these, and one could host the same thing inside docker in like 10 minutes.

    …On the other hand, it has no access to email I think?

    • WhyJiffie@sh.itjust.works · 2 days ago · +10/-3

      Why does an email service need a chatbot, even for business?

      they are not only an email service, for quite some time now

      There are about a bajillion of these, and one could host the same thing inside docker in like 10 minutes.

      Sure, with a thousand or two dollars’ worth of equipment and the computer knowledge. Anyone could do it, really. But even if not, why don’t they just rawdog DeepSeek? I don’t get it either.

      …On the other hand, it has no access to email I think?

      that’s right. you can upload files though, or select some from your proton drive, and can do web search.

      • brucethemoose@lemmy.world · 2 days ago (edited) · +8

        Sure, with a thousand or two dollars’ worth of equipment and the computer knowledge. Anyone could do it, really. But even if not, why don’t they just rawdog DeepSeek? I don’t get it either.

        What I mean is there are about 1000 different places to get 32B class models via Open Web UI with privacy guarantees.

        With mail, vpn, (and some of their other services?) they have a great software stack and cross integration to differentiate them, but this is literally a carbon copy of any Open Web UI service… There is nothing different other than the color scheme and system prompt.

        I’m not trying to sound condescending, but it really feels like a cloned “me too,” with the only value being the Proton brand and customer trust.

          • ddh@lemmy.sdf.org · 2 days ago · +3

            Retrieval-augmented generation (RAG) is a technique that enables large language models (LLMs) to retrieve and incorporate new information. With RAG, LLMs do not respond to user queries until they refer to a specified set of documents. These documents supplement information from the LLM’s pre-existing training data. This allows LLMs to use domain-specific and/or updated information that is not available in the training data. For example, this helps LLM-based chatbots access internal company data or generate responses based on authoritative sources.

            From Retrieval-augmented generation.

            Specifically here, I imagine the idea is to talk to the chatbot about what’s in your documents.
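
            A minimal sketch of that retrieval step, in Python, assuming a toy keyword-overlap retriever and made-up documents (real systems use vector embeddings, and the assembled prompt then goes to an LLM API):

            ```python
            # Toy RAG sketch: retrieve the most relevant documents, then build an
            # augmented prompt. The scoring here is deliberately simplistic.

            def score(query: str, doc: str) -> int:
                """Count how many query words appear in the document."""
                return len(set(query.lower().split()) & set(doc.lower().split()))

            def retrieve(query: str, docs: list[str], k: int = 2) -> list[str]:
                """Return the k highest-scoring documents."""
                return sorted(docs, key=lambda d: score(query, d), reverse=True)[:k]

            def build_prompt(query: str, docs: list[str]) -> str:
                """Prepend the retrieved context so the model answers from it."""
                context = "\n".join(f"- {d}" for d in retrieve(query, docs))
                return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

            documents = [
                "Lumo stores chat history with zero-access encryption.",
                "Proton Drive offers encrypted file storage.",
                "The cafeteria serves lunch from 11:30 to 13:00.",
            ]

            # This string, not the bare question, is what would be sent to the LLM.
            print(build_prompt("When does the cafeteria serve lunch?", documents))
            ```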

  • digger@lemmy.ca · 2 days ago · +212/-4

    How much longer until the AI bubble pops? I’m tired of this.

    • wewbull@feddit.uk · 2 days ago · +40

      It’s when the coffers of Microsoft, Amazon, Meta and the investment banks dry up. All of them are losing billions every month, but it’s all driven by fewer than 10 companies. Nvidia is lapping up the money of course, but once the AI companies stop buying GPUs in crazy numbers it’s going to be a rocky ride down.

      • astanix@lemmy.world · 2 days ago · +7

        Is it like crypto, where CPUs were good, then GPUs, then FPGAs, then ASICs? Or is this different?

        • steelrat@lemmy.world · 24 hours ago (edited) · +1

          Wildly different, though similar in one way: just as ASICs were tuned to specific crypto tasks, everyone is now making custom silicon for neural nets and such.

          I wouldn’t plan on it going away. Apple put optimized neural net chips in their last phone. Same with Samsung.

        • wewbull@feddit.uk · 2 days ago · +17

          I think it’s different. The fundamental operation of all these models is multiplying big matrices of numbers together. GPUs are already optimised for this. Crypto was trying to make the algorithm fit the GPU rather than it being a natural fit.

          With FPGAs you take a 10x loss in clock speed but can have precisely the algorithm you want. ASICs then give you the clock speed back.

          GPUs are already ASICs that implement the ideal operation for ML/AI, so FPGAs would be a backwards step.
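
          To make that concrete, a rough sketch with made-up layer sizes (not any particular model): the heavy lifting in a transformer-style feed-forward block is a couple of dense matrix multiplications, which is precisely the operation GPUs and their tensor cores are built to accelerate.

          ```python
          # Toy forward pass: almost all of the FLOPs below are in the two matmuls.
          import numpy as np

          batch, d_model, d_ff = 8, 1024, 4096          # made-up sizes for illustration
          x = np.random.randn(batch, d_model)           # token activations
          W1 = np.random.randn(d_model, d_ff)           # feed-forward weights
          W2 = np.random.randn(d_ff, d_model)

          h = np.maximum(x @ W1, 0.0)                   # matmul + ReLU
          y = h @ W2                                    # second matmul
          print(y.shape)                                # (8, 1024)
          ```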

        • brucethemoose@lemmy.world · 2 days ago (edited) · +3

          If bitnet or some other technical innovation pans out? Straight to ASICs, yeah.

          Future smartphones will probably be pretty good at running them.

        • cley_faye@lemmy.world · 2 days ago · +5/-1

          It’s probably different. The crypto bubble couldn’t actually do much in the field of useful things.

          Now, I’m saying that with a HUGE grain of salt, but there are decent applications for LLMs (let’s not call that AI). Unfortunately, these usages are not really in the sights of any business putting tons of money into their “AI” offers.

          I kinda hope we’ll get better LLM hardware to operate privately, using ethically sourced models, because some stuff is really neat. But that’s not the push they’re going for right now. Fortunately, we can already sort of do that, although the source of many publicly available models is currently… not that great.

          • KumaSudosa@feddit.dk · 2 days ago · +9/-2

            LLMs are absolutely amazing for a lot of things. I use them at work all the time to check code blocks or remember syntax. They are NOT and should NOT be your main source of general information, and we collectively have to realise how problematic and energy-consuming they are.

          • Zos_Kia@lemmynsfw.com · 2 days ago · +2/-1

            There’s absolutely a push for specialized hardware, look up that company called Groq !

            • KingRandomGuy@lemmy.world · 2 days ago · +1

              Yes, but at this point most specialized hardware only really works for inference. Most players are training on NVIDIA GPUs, with the primary exception of Google, which has its own TPUs, but even these have limitations compared to GPUs (certain kinds of memory accesses are intractably slow, making them unable to work well for methods like Instant NGP).

              GPUs are already quite good, especially with things like tensor cores.

    • cley_faye@lemmy.world · 2 days ago · +20

      We’re still in the “IT’S GETTING BILLIONS IN INVESTMENTS” part. Can’t wait for this to run out too.

    • Defaced@lemmy.world · 2 days ago · +15/-1

      Here’s the thing: it kind of already has. The new AI push is related to smaller projects and AI agents like Claude Code and GitHub Copilot integration. MCPs are also starting to pick up some steam as a way to refine prompt engineering. The basic AI “bubble” popped already; what we’re seeing now is an odd arms race of smaller AI projects, thanks to companies like DeepSeek pushing AI hosting costs so low that anyone can reasonably host and tweak their own LLMs without it costing a fortune. It’s really an interesting thing to watch, but honestly I don’t think we’re going to see the major gains the tech industry is trying to push anytime soon. Take any claims of AGI and OpenAI “breakthroughs” with a mountain of salt, because they will do anything to keep the hype up and drive up their stock prices. Sam Altman is a con man and nothing more; don’t believe what he says.

      • hobovision@mander.xyz · 2 days ago (edited) · +3

        You’re saying the AI bubble has popped because even more small companies and individuals are getting in on the action?

        That’s kind of the definition of a bubble, actually: more and more people trying to make money on a trend that doesn’t have that much real value in it. The dot-com bubble played out nearly the same way. It wasn’t that the web/tech wasn’t valuable (it’s now the most valuable sector of the world economy), but while the bubble expanded, more was being invested than it was worth, because no one wanted to miss out and it was accessible enough that almost anyone could try it.

        • Defaced@lemmy.world · 2 days ago · +4/-1

          I literally said exactly what you’re explaining. I’m not sure what you’re trying to accomplish here…

          • EldritchFemininity@lemmy.blahaj.zone · 2 days ago · +4

            What they’re saying is that you said that the bubble has kinda already popped because (insert description of the middle of the dot com bubble here when smaller companies began to join in). Based on that, the bubble hasn’t popped at all, small companies are just able to buy in as well before the collapse hits.

            • nymnympseudonym@lemmy.world · 1 day ago · +2

              It’s like the Internet in 1998

              Pets.com hasn’t gone bust yet, but it will.

              The bubble will burst. BUT … the entire world will run on this new technology, nobody will imagine living without it, and multibillion dollar companies will profit and be created from it

    • kepix@lemmy.world · 1 day ago · +3/-2

      As long as certain jobs and tasks can be done more easily, and searches can be done faster, it’s gonna stay. It’s not a fad like NFTs. The bubble here is the energy and water consumption part.

      • kadu@lemmy.world · 20 hours ago · +1

        As long as certain jobs and tasks can be done more easily, and searches can be done faster

        I’m still waiting for somebody to prove any of these statements are true. And I say that as somebody working in a company that demands that several employees use AI: all I see is that they now take extra time manually fixing whatever bad output the LLM produced, and are slowly losing their ability to communicate without first consulting ChatGPT, which is both slow and concerning.

    • rozodru@lemmy.world · 2 days ago · +6/-1

      depends on what and with whom. based on my current jobs with smaller companies and start ups? soon. they can’t afford the tech debt they’ve brought onto themselves. big companies? who knows.

    • systemglitch@lemmy.world · 2 days ago · +4/-3

      Time to face the facts: this utter shit is here to stay, just like every other bit of enshittification we get exposed to.

  • DreamlandLividity@lemmy.world · 2 days ago (edited) · +107/-14

    The worst part is that, once again, Proton is trying to convince its users that it’s more secure than it really is. You have to wonder what else they are lying about or misrepresenting.

    • hansolo@lemmy.today · 2 days ago · +99/-15

      Both your take, and the author, seem to not understand how LLMs work. At all.

      At some point, yes, an LLM model has to process clear text tokens. There’s no getting around that. Anyone who creates an LLM that can process 30 billion parameters while encrypted will become an overnight billionaire from military contracts alone. If you want absolute privacy, process locally. Lumo has limitations, but goes farther than duck.ai at respecting privacy. Your threat model and equipment mean YOU make a decision for YOUR needs. This is an option. This is not trying to be one size fits all. You don’t HAVE to use it. It’s not being forced down your throat like Gemini or CoPilot.
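
      To illustrate why (a toy sketch with a made-up vocabulary, not any real tokenizer): the model’s input is token IDs derived from the raw string, and that mapping simply cannot be computed from ciphertext.

      ```python
      # Toy tokenizer: plaintext in, token IDs out. Real models use BPE, but the
      # principle is the same -- the inference server has to see the text in the clear.
      vocab = {"my": 0, "bank": 1, "statement": 2, "says": 3, "<unk>": 4}

      def tokenize(text: str) -> list[int]:
          return [vocab.get(word, vocab["<unk>"]) for word in text.lower().split()]

      print(tokenize("My bank statement says hello"))  # [0, 1, 2, 3, 4]
      ```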

      And their LLM: it’s Mistral, OpenHands and OLMO, all open source. It’s in their documentation, so this article straight up lies about that. Like… did Google write this article? It’s simply propaganda.

      Also, Proton does have some circumstances where it lets you decrypt your own email locally. Otherwise it’s basically impossible to search your email for text in the email body. They already had that as an option, and if users want AI assistants, that’s obviously their bridge. But it’s not a default setup. It’s an option you have to set up. It’s not for everyone. Some users want that. It’s not forced on everyone. Chill TF out.

      • DreamlandLividity@lemmy.world · 2 days ago (edited) · +28/-9

        Their AI is not local, so adding it to your email means breaking e2ee. That’s to some extent fine. You can make an informed decision about it.

        But Proton is not putting warning labels on this. They are trying to confuse people into thinking it has the same security as their e2ee mails. Just look at the “zero trust” bullshit on Proton’s own page.

        • youmaynotknow@lemmy.zip · 2 days ago (edited) · +41/-2

          Where does it say “zero trust” on ‘Proton’s own page’? It does not say “zero trust” anywhere; it says “zero-access”. The data is encrypted at rest, so it is not e2ee. They never mention end-to-end encryption for Lumo, except for ghost mode, and there they are talking about the chat once it’s complete and you choose to leave it there to use later, not about the prompts you send in.

          Zero-access encryption

          Your chats are stored using our battle-tested zero-access encryption, so even we can’t read them, similar to other Proton services such as Proton Mail, Proton Drive, and Proton Pass. Our encryption is open source and trusted by over 100 million people to secure their data.

          Which means that they are not advertising anything they are not doing or cannot do.

          By posting this disinformation, all you’re achieving is getting people to go back to all the shit services out there for “free”, because many will start believing that privacy is way harder than it actually is (“so what’s the point?”) or, even worse, that no alternative will help them be more private, so they might as well just stop trying.

        • hansolo@lemmy.today · 2 days ago · +16/-6

          My friend, I think the confusion stems from you thinking you have deep technical understanding on this, when everything you say demonstrates that you don’t.

          First off, you don’t even know the terminology. A local LLM is one YOU run on YOUR machine.

          Lumo apparently runs on Proton servers - where their email and docs all are as well. So I’m not sure what “Their AI is not local!” even means, other than that you don’t know what LLMs do or what they actually are. Do you expect a 32B LLM that would use about a 32GB video card to all get downloaded and run in a browser? Buddy…just…no.

          Look, Proton can at any time MITM attack your email, or if you use them as a VPN, MITM VPN traffic if it feels like. Any VPN or secure email provider can actually do that. Mullvad can, Nord, take your pick. That’s just a fact. Google’s business model is to MITM attack your life, so we have the counterfactual already. So your threat model needs to include how much you trust the entity handling your data not to do that, intentionally or by letting others through negligence.

          There is no such thing as e2ee LLMs. That’s not how any of this works. Doing e2ee for the chats to get what you type into the LLM context window, letting the LLM process tokens the only way they can, getting you back your response, and getting it to not keep logs or data, is about as good as it gets for not having a local LLM - which, remember, means on YOUR machine. If that’s unacceptable for you, then don’t use it. But don’t brandish your ignorance like you’re some expert, and that everyone on earth needs to adhere to whatever “standards” you think up that seem ill-informed.

          Also, clearly you aren’t using Proton anyway because if you need to search the text of your emails, you have to process that locally, and you have to click through 2 separate warnings that tell you in all bold text “This breaks the e2ee! Are you REALLY sure you want to do this?” So your complaint about warnings is just a flag saying you don’t actually know and are just guessing.

          • DreamlandLividity@lemmy.world · 2 days ago (edited) · +10/-11

            A local LLM is one YOU run on YOUR machine.

            Yes, that is exactly what I am saying. You seem to be confused by basic English.

            Look, Proton can at any time MITM attack your email

            They are not supposed to be able to, and well-designed e2ee services can’t be. That’s the whole point of e2ee.

            There is no such thing as e2ee LLMs. That’s not how any of this works.

            I know. When did I say there is?

            • hansolo@lemmy.today · 2 days ago · +4/-3

              So then you object to the premise any LLM setup that isn’t local can ever be “secure” and can’t seem to articulate that.

              What exactly is dishonest here? The language on their site is factually accurate, I’ve had to read it 7 times today because of you all. You just object to the premise of non-local LLMs and are, IMO, disingenuously making that a “brand issue” because…why? It sounds like a very emotional argument as it’s not backed by any technical discussion beyond “local only secure, nothing else.”

              Beyond the fact that

              They are not supposed to be able to, and well-designed e2ee services can’t be.

              So then you trust that their system is well-designed already? What is this cognitive dissonance that they can secure the relatively insecure format of email, but can’t figure out TLS and flushing logs for an LLM on their own servers? If anything, it’s not even a complicated setup. TLS to the context window, don’t keep logs, flush the data. How do you think no-log VPNs work? This isn’t exactly all that far off from that.

              • DreamlandLividity@lemmy.world · 2 days ago · +5/-2

                What exactly is dishonest here? The language on their site is factually accurate, I’ve had to read it 7 times today because of you all.

                I object to how it is written. Yes, technically it is not wrong. But it intentionally uses confusing language and rare technical terminology to imply it is as secure as e2ee. They compare it to Proton Mail and Drive, which are supposedly e2ee.

                • loudwhisper@infosec.pub · 2 days ago · +4/-1

                  They compare it to Proton Mail and Drive, which are supposedly e2ee.

                  Only Drive is. Email is not always e2ee; it uses zero-access encryption, which I believe is the exact same mechanism used by this chatbot, so the comparison is quite fair tbh.

                • hansolo@lemmy.today · 2 days ago · +1/-1

                  It is e2ee – with the LLM context window!

                  When you email someone outside Proton servers, doesn’t the same thing happen anyway? But the LLM is on Proton servers, so what’s the actual vulnerability?

            • Jerkface (any/all)@lemmy.ca · 2 days ago (edited) · +1/-1

              They are not supposed to be able to, and well-designed e2ee services can’t be. That’s the whole point of e2ee.

              You’re using their client. You get a fresh copy every time it changes. Of course you are vulnerable to a MITM attack, if they chose to attempt one.

              • DreamlandLividity@lemmy.world · 2 days ago · +4

                If you insist on being a fanboy then go ahead. But this is like arguing a bulletproof vest is useless because it does not cover your entire body.

                • null@lemmy.nullspace.lol · 2 days ago · +2

                  Or because the bulletproof vest company might sell you a faulty one as part of a conspiracy to kill you.

        • loudwhisper@infosec.pub · 2 days ago · +7

          Scribe can be local, if that’s what you are referring to.

          They also have a specific section on it at https://proton.me/support/proton-scribe-writing-assistant#local-or-server

          Also, emails for the most part are not e2ee; they can’t be, because the other party is not using encryption. Proton uses “zero-access”, which is different. It means Proton gets the email in clear text, encrypts it with your public PGP key, deletes the original, and sends it to you.

          See https://proton.me/support/proton-mail-encryption-explained

          The email is encrypted in transit using TLS. It is then unencrypted and re-encrypted (by us) for storage on our servers using zero-access encryption. Once zero-access encryption has been applied, no-one except you can access emails stored on our servers (including us). It is not end-to-end encrypted, however, and might be accessible to the sender’s email service.
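
          A concept-only sketch of that zero-access flow, using PyNaCl sealed boxes for brevity (Proton actually uses OpenPGP, and this is not their code):

          ```python
          # pip install pynacl -- illustrates "server encrypts to a key it cannot open"
          from nacl.public import PrivateKey, SealedBox

          user_key = PrivateKey.generate()          # private key only the user holds
          server_pubkey = user_key.public_key       # server only ever stores the public key

          # Server receives the mail in plaintext (e.g. over TLS from another provider)...
          plaintext = b"Your invoice for July is attached."
          # ...encrypts it to the user's public key, then discards the plaintext copy.
          stored_ciphertext = SealedBox(server_pubkey).encrypt(plaintext)

          # Only the private-key holder can get the message back out of storage.
          print(SealedBox(user_key).decrypt(stored_ciphertext))
          ```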

        • hansolo@lemmy.today · 2 days ago · +1

          SMH

          No one is saying it’s encrypted when processed, because that’s not a thing that exists.

          • wewbull@feddit.uk · 1 day ago · +1

            End-to-end encryption of an interaction with a chatbot would mean the company doesn’t decrypt your messages to it: it operates on the encrypted text, gets an encrypted response that only you can decrypt, and sends it to you. You then decrypt the response.

            So yes. It would require operating on encrypted data.

            • hansolo@lemmy.today · 1 day ago · +1

              The documentation says it’s TLS-encrypted to the LLM context window. The LLM processes it, and the context window output goes back to you via TLS.

              As long as the context window is only connected to Proton servers decrypting the TLS tunnel, the LLM runs on their servers, and, much like a VPN, they don’t keep logs, then I don’t see what the problem actually is here.

      • DreamlandLividity@lemmy.world · 2 days ago (edited) · +9/-6

        Zero-access encryption

        Your chats are stored using our battle-tested zero-access encryption, so even we can’t read them, similar to other Proton services such as Proton Mail, Proton Drive, and Proton Pass.

        From Proton’s own website.

        And why this is not true is explained in the article from the main post, and can also be easily figured out with a little common sense (the AI can’t respond to messages it can’t understand, so the messages must be decrypted for it).

        • loudwhisper@infosec.pub · 2 days ago · +10/-1

          They actually don’t explain it in the article. The author doesn’t seem to understand why there is a claim of e2e chat history and zero-access for chats. The point of zero-access is trust. You need to trust the provider to do it, because it’s not cryptographically verifiable. Upstream there is no encryption, and zero-access means providing the service (usually, unencrypted), then encrypting and discarding the plaintext.

          Of course the model needs to have access to the context in plaintext, exactly like Proton has access to emails sent to non-PGP addresses. What they can do is encrypt the chat histories, because these don’t need active processing, and encrypt on the fly the communication between the model (which needs plaintext access) and the client. The same thing happens with Scribe.

          I personally can’t stand LLMs, I am waiting eagerly for this bubble to collapse, but this article is essentially a nothing burger.

          • DreamlandLividity@lemmy.world · 2 days ago (edited) · +4/-2

            You understand that. I understand that. But try to read it from the point of view of an average user who knows next to nothing about cybersecurity and LLMs. It sounds like the e2ee that Proton Mail and Drive are famous for. To us, that’s obviously impossible, but most people will interpret the marketing this way.

            It’s intentional deception, using technical terms to confuse nontechnical customers.

            • loudwhisper@infosec.pub · 2 days ago · +3/-1

              How would you explain it in a way that is nontechnical and accurate, and that also differentiates you from all the other companies that are not doing anything even remotely similar? I am asking genuinely, because from the perspective of a user who has decided to trust the company, zero-access is functionally much closer to e2ee than it is to “regular services”, which is the alternative.

              • DreamlandLividity@lemmy.world · 2 days ago · +4

                The easiest way is to explain the consequence.

                We can’t access your chat history retroactively, but we can start wiretapping your future chats.

                If that is too honest for you, then just explain that the data is encrypted after the LLM reads it, instead of using technical terms like zero-access.

        • youmaynotknow@lemmy.zip · 2 days ago · +4

          This I can agree on. They would have been better served, and would have made it clearer to their users, by clarifying that it is not ‘zero trust’ and not e2ee. At the end of the day, once the masses start trusting a company they stop digging deep and just read the first couple of paragraphs of the details, if at all. But some of us are always digging to find the weakest links in our security and privacy and try to strengthen them. So yeah, pretty stupid of them.

  • brucethemoose@lemmy.world · 2 days ago (edited) · +19

    OK, so I just checked the page:

    https://lumo.proton.me/guest

    Looks like a generic Open Web UI instance, much like Qwen’s: https://openwebui.com/

    Based on this support page, they are using open models and possibly finetuning them:

    https://proton.me/support/lumo-privacy

    The models we’re using currently are Nemo, OpenHands 32B, OLMO 2 32B, and Mistral Small 3

    But this information is hard to find, and they aren’t particularly smart models, even for 32B-class ones.

    Still… the author is incorrect; they do specify how long requests are kept:

    When you chat with Lumo, your questions are sent to our servers using TLS encryption. After Lumo processes your query and generates a response, the data is erased. The only record of the conversation is on your device if you’re using a Free or Plus plan. If you’re using Lumo as a Guest, your conversation is erased at the end of each session. Our no-logs policy ensures we keep no logs of what you ask, or what Lumo replies. Your chats can’t be seen, shared, or used to profile you.

    But it also mentions that, as is a necessity now, they are decrypted on the GPU servers for processing. Theoretically they could hack the input/output layers and the tokenizer into a pseudo E2E encryption scheme, but I haven’t heard of anyone doing this yet… And it would probably be incompatible with their serving framework (likely vllm) without some crack CUDA and Rust engineers (as you’d need to scramble the text and tokenize/detokenize it uniquely for scrambled LLM outer layers for each request).

    They are right about one thing: Proton all but advertises Lumo as E2E when that is a lie. Per its usual protocol, Open Web UI will send the chat history for that particular chat to the server with each request, where it is decoded and tokenized. If the GPU server were to be hacked, it could absolutely be logged and intercepted.
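
    To make that concrete, a rough sketch of the request pattern (the URL, token, and model name are placeholders, not Lumo’s actual API): an OpenAI-compatible chat endpoint receives the entire chat history in plaintext inside the TLS tunnel on every turn, which is exactly what a compromised GPU server could log.

    ```python
    # Hypothetical OpenAI-compatible chat request; everything in `history` crosses
    # the wire (TLS-protected) and is decoded server-side on every single turn.
    import requests

    history = [
        {"role": "user", "content": "Summarise my notes on the Q3 budget."},
        {"role": "assistant", "content": "Sure, paste the notes."},
        {"role": "user", "content": "Revenue was 1.2M, costs were 900k..."},
    ]

    resp = requests.post(
        "https://llm.example.com/v1/chat/completions",      # placeholder endpoint
        headers={"Authorization": "Bearer <token>"},
        json={"model": "mistral-small", "messages": history},
        timeout=60,
    )
    print(resp.json()["choices"][0]["message"]["content"])
    ```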

  • cley_faye@lemmy.world · 2 days ago · +22/-5

    Any business putting a “privacy first” label on something that works only on their servers, and requires full access to plaintext data to operate, should be seen as lying.

    I’ve been annoyed by Proton for a long while; they do (did?) provide a seemingly adequate service, but claims like “your mails are safe”, when they obviously had to have them in plaintext on their server, even if only for compatibility with current standards, kept me away from them.

    • Encrypt-Keeper@lemmy.world · 2 days ago · +15/-3

      they obviously had to have them in plaintext on their server, even if only for compatibility with current standards

      I don’t think that’s obvious at all. On the contrary, that’s a pretty bold claim to make. Do you have any evidence that they’re doing this?

      • DeathByBigSad@sh.itjust.works · 2 days ago (edited) · +13/-6

        Incoming emails that aren’t from Proton or PGP-encrypted (which is like 99% of emails) arrive at Proton’s servers over TLS, which they terminate, and then they have the full plaintext. This is not some conspiracy; this is just how email works.

        Now, Proton and various other “encrypted email” services then take that plaintext and encrypt it with your public key, then store the ciphertext on their servers, and then they’re supposed to discard the plaintext, so that in case of a future court order, they wouldn’t have the plaintext anymore.

        But you can’t be certain they aren’t lying, since they do necessarily have to have access to the plaintext for email to function. So “we can’t read your emails” comes with a huge asterisk: it only applies to mail sent between Proton accounts or other PGP-encrypted emails. Your average bank statements and tax forms are all accessible to Proton (you’re only relying on their promise not to read them).

        • Encrypt-Keeper@lemmy.world · 2 days ago (edited) · +17

          OK, yeah, that’s a far cry from Proton actually “having your unencrypted emails on their servers”, as if they’re not encrypted at rest.

          There’s the standard layer of trust you need to have in a third party when you’re not self-hosting. Proton has so far proven that they do in fact encrypt your emails and haven’t given any up to authorities when ordered to, so I’m not sure where the issue is. I thought they were caught not encrypting them or something.

          • Vinstaal0@feddit.nl · 1 day ago · +1

            We need to call for an audit of Proton’s policies to see if they actually do what they say. That way we can know almost for certain that everything is as good as they say.

            • Encrypt-Keeper@lemmy.world · 1 day ago · +1

              I mean, we know from documented events that Proton doesn’t store your emails in plain text, because there have been Swiss orders to turn over information, which they have to comply with, and they’ve never turned over emails, because they can’t.

          • cley_faye@lemmy.world · 2 days ago · +1/-6

            OK, yeah, that’s a far cry from Proton actually “having your unencrypted emails on their servers”, as if they’re not encrypted at rest.

            See my other reply. There is no way to retrieve your mail using IMAP on a regular client if they’re encrypted on the server. And Gmail can retrieve your mails from proton using IMAP. It’s even in their own (proton’s) documentation.

            • nymnympseudonym@lemmy.world · 1 day ago · +2

              Agreed.

              Really, if someone wants to use an LLM, the right place to run it is in a sandbox locally on your own computer.

              Anything else is just a stupid architecture. You don’t run your Second Brain on Someone Else’s Computer.

            • Encrypt-Keeper@lemmy.world · 2 days ago · +5

              There is no way to retrieve your mail using IMAP on a regular client if they’re encrypted on the server.

              That is probably why you can’t retrieve your emails using IMAP from a regular client.

              And Gmail can retrieve your mails from proton using IMAP. It’s even in their own (proton’s) documentation.

              I don’t think it can. Where in the documentation did you find that?

              • cley_faye@lemmy.world · 2 days ago (edited) · +2

                And Gmail can retrieve your mails from proton using IMAP. It’s even in their own (proton’s) documentation.
                

                I don’t think it can. Where in the documentation did you find that?

                An online search brought me here: https://www.getmailbird.com/setup/en/access-protonmail-com-via-imap-smtp which did look like a documentation page about how to do exactly that. Obviously, it has nothing to do with them, and the actual details make no sense the lower you get in the page. I’ve been had :)

                They can still see most mail transit through their service in plaintext in both directions, though, which remains a privacy issue, but that has more to do with email protocols than anything.

                  • Encrypt-Keeper@lemmy.world · 1 day ago · +1

                  You’re right that they can see the emails in transit if you’re not using encryption, but they never said they can’t. They are as secure as they can possibly be, and are honest about what’s secure and what’s not. I would leave Protonmail at the first sniff of trouble but I just haven’t seen anything that concerning.

        • cley_faye@lemmy.world · 2 days ago · +1/-2

          Now, Proton and various other “encrypted email” services then take that plaintext and encrypt it with your public key, then store the ciphertext on their servers, and then they’re supposed to discard the plaintext, so that in case of a future court order, they wouldn’t have the plaintext anymore.

          You would not be able to retrieve your mails using IMAP from a regular mail client if they were doing that. You can even retrieve them from Gmail, which is unlikely to support any kind of “bring your own private key to decrypt mails from IMAP”.

      • cley_faye@lemmy.world · 2 days ago · +1/-6

        Yes. They support IMAP, which means an IMAP client can read your mails from the server. The IMAP protocol does not support encryption, so any mail that does not add another layer of encryption (like GPG with encryption) is available in plaintext through IMAP, and as such, on the server.

        If that’s not enough, when you send a mail to a third party that just uses plain, old regular mail, it is sent from their (Proton’s) SMTP server, in plaintext. Again, unless you add a layer of encryption (assuming the recipient understands it, too), it’s plaintext. On the servers.

        Receiving is the same; if someone sends a mail to your Proton address, it shows up in full plaintext on their SMTP server. Whatever they do after that (and we’ve established it’s not client-controlled encryption), they have access to it.

        In the case of GPG with encryption (not only for signature), the message is encrypted everywhere (assuming your “sent” folder is configured properly). But that requires both you and the other party to support it, which has nothing to do with Proton; you could just as well do that over Gmail.

        So, no, not a bold claim. The very basics of how email standards work require it.

        Now, I’m not saying that Proton has nefarious plans or anything. It is very possible that they act in good faith when they say they “don’t snoop”, and maybe they even have proper monitoring so that admins have a somewhat hard time looking at the data without leaving a trace. But it’s 100% in the clear up there as long as you’re not adding your own layer of encryption on top of it, and as such, you, as the user, have to be aware of that. It might be fully encrypted at rest to prevent a third party from fetching a drive and getting data; logs might be excessively scrubbed to remove all trace of from/to addresses (something very common in logs, for maintenance purposes); they might have built-in encryption in their own clients that implements GPG or anything similar between their users, and they might even do it properly with fully client-side-controlled keypairs. But the mail content? It has to be available, or the service could not operate.

        • DeathByBigSad@sh.itjust.works · 2 days ago · +8

          Proton Mail does not support IMAP; what they have is a program called Proton Bridge that locally decrypts your email. You can then set up your IMAP client to read from Proton Bridge, giving you a seamless experience with one email client having access to all your email accounts.
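
          For example, a minimal sketch of how any IMAP client talks to the Bridge using Python’s standard imaplib (the local port and the bridge-specific password come from the Bridge app; 1143 is a common default, and the certificate is locally generated, so treat this as a sketch rather than copy-paste config):

          ```python
          # Talk to Proton Bridge on localhost: the Bridge has already decrypted the
          # mailbox, so a plain IMAP session sees the messages like any other account.
          import imaplib
          import ssl

          ctx = ssl.create_default_context()
          ctx.check_hostname = False        # Bridge presents a locally generated certificate
          ctx.verify_mode = ssl.CERT_NONE

          conn = imaplib.IMAP4("127.0.0.1", 1143)       # port shown in the Bridge app
          conn.starttls(ssl_context=ctx)
          conn.login("you@proton.me", "bridge-specific-password")  # not your Proton password
          conn.select("INBOX")
          status, data = conn.search(None, "ALL")
          print(status, len(data[0].split()), "messages readable locally")
          conn.logout()
          ```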

        • Encrypt-Keeper@lemmy.world · 2 days ago (edited) · +8

          They support IMAP, which means an IMAP client can read your mails from the server.

          Proton mail does not support IMAP. Because your emails are encrypted on the server.

          Again, unless you add a layer of encryption (assuming the recipient understands it, too), it’s plaintext. On the servers.

          Protonmail doesn’t claim that non-protonmail email is end to end encrypted. Any emails sent to a regular email without third party encryption will be plain text through the SMTP server, but they don’t store it. So in this case they are still not storing your emails in plaintext. Your recipient will, but that’s out of Protonmail’s control.

          shows up in full plaintext on their SMTP server. Whatever they do after that (and we’ve established it’s not client-controlled encryption), they have access to it.

          You’ve not established that at all. Protonmail stores that message with client side encryption and they have no access to it. Nothing you’ve brought up here suggests that anything is stored in plaintext on Protonmail servers.

          • cley_faye@lemmy.world · 2 days ago · +2

            I’ll just repost the same message here, for completeness’ sake.

            Well, I’ve been had. There is indeed no IMAP support; during my quick lookup I ended up on a website that looks a lot like a real documentation page and claims that it does. My bad.

            The point about sending and receiving messages in cleartext stands, as SMTP works that way, but at rest it is possible they’re keeping them encrypted.

          • cley_faye@lemmy.world · 2 days ago · +1

            Well, I’ve been had. There is indeed no IMAP support; during my quick lookup I ended up on a website that looks a lot like a real documentation page and claims that it does. My bad.

            The point about sending and receiving messages in cleartext stands, as SMTP works that way, but at rest it is possible they’re keeping them encrypted.

    • pcrazee@feddit.org · 2 days ago · +3/-1

      Proton has always been shitty. They don’t even give you the encryption keys. Always been a red flag for me.

      Not your keys, not your encryption.

      • Vinstaal0@feddit.nl · 1 day ago · +1

        For most people, having access to their own encryption keys will lead to data loss.

        Most countries have systems in place for doing proper audits on companies, which you can trust. You can audit companies’ securities or financial reports, which are the most common ones, but you can also audit a VPN on whether or not they keep logs (PureVPN has done this), and you can audit whether or not they have access to your encryption keys.

        We really need to normalise that kind of control to keep companies in check.

  • Gaja0@lemmy.zip · 2 days ago · +30/-10

    I’m just saying Andy sucking up to Trump is a red flag. I’m cancelling in 2026 🫠

  • A_norny_mousse@feddit.org · 2 days ago · +28/-6

    For a critical blog, the first few paragraphs sound a lot like they’re shilling for Proton.

    I’m not sure if I’m supposed to be impressed by the author’s witty wording, but “the cool trick they do” is - full encryption.

    Moving on.

    But that’s misleading. The actual large language model is not open. The code for Proton’s bit of Lumo is not open source. The only open source bit that Proton’s made available is just some of Proton’s controls for the LLM. [GitHub]

    In the single most damning thing I can say about Proton in 2025, the Proton GitHub repository has a “cursorrules” file. They’re vibe-coding their public systems. Much secure!

    oof.

    Over the years I’ve heard many people claim that Proton’s servers being in Switzerland are more secure than in other EU countries - well, there’s also this now:

    Proton is moving its servers out of Switzerland to another country in the EU they haven’t specified. The Lumo announcement is the first that Proton’s mentioned this.

    No company is safe from enshittification - always look for, and base your choices on, the legally binding stuff, before you commit. Be wary of weasel wording. And always, always be ready to move* on when the enshittification starts despite your caution.


    * Regarding email, there are redirection services, a.k.a. eternal email addresses, in some cases run by venerable non-profits.

    • Tetsuo@jlai.lu · 2 days ago · +36

      Regarding the fact that Proton is stopping hosting in Switzerland: I thought it was because of new laws in Switzerland and that they had not much of a choice?

      • DeathByBigSad@sh.itjust.works · 2 days ago · +9

        The law isn’t a law yet; it’s just a proposal. Proton is still in Switzerland, but they said they’re gonna move if the surveillance law actually becomes law.

    • loudwhisper@infosec.pub · 2 days ago · +18/-1

      Over the years I’ve heard many people claim that Proton’s servers being in Switzerland are more secure than in other EU countries

      Things change. They are doing it because Switzerland is proposing legislation that would definitely make that claim untrue. Europe is no paradise, especially certain countries, but it still makes sense.

      From the lumo announcement:

      Lumo represents one of many investments Proton will be making before the end of the decade to ensure that Europe stays strong, independent, and technologically sovereign. Because of legal uncertainty around Swiss government proposals to introduce mass surveillance — proposals that have been outlawed in the EU — Proton is moving most of its physical infrastructure out of Switzerland. Lumo will be the first product to move.

      This shift represents an investment of over €100 million into the EU proper. While we do not give up the fight for privacy in Switzerland (and will continue to fight proposals that we believe will be extremely damaging to the Swiss economy), Proton is also embracing Europe and helping to develop a sovereign EuroStack for the future of our home continent. Lumo is European, and proudly so, and here to serve everybody who cares about privacy and security worldwide.

    • ItsComplicated@sh.itjust.works · 2 days ago · +14

      Switzerland has a surveillance law in the works that will force VPNs, messaging apps, and online platforms to log users’ identities, IP addresses, and metadata for government access.

    • hansolo@lemmy.today · 2 days ago · +9/-4

      Really? This article reads like AI slop reproducing Proton copy, then pivoting to undermine them with straight-up incorrect info.

      You know how Microsoft manages to make LibreOffice throw errors on Windows 11? You really didn’t stop to think that Google might contract out some slop farms to shit on Proton?