• @drwankingstein@lemmy.dbzer0.com
    4 points · 14 days ago

    I don’t even think this is the case; Google does a lot pretty much everywhere. One example: one of the things they’re pushing for is locally run AI (Gemini, Stable Diffusion, etc.) running on your GPU via WebGPU instead of needing cloud services, which is privacy-friendly for a myriad of reasons. In fact, we now have multiple LLM implementations that run locally in the browser on WebGPU, and even a Stable Diffusion implementation (though I never got that one to work, since my beefiest GPU is an Arc A380 with 6 GB of VRAM).

    They do other stuff too, but with the recent AI craze, I think this is probably the most relevant example.

    • LeafletOP
      6 points · 14 days ago

      LLMs are expensive to run, so locally running them saves Google money.

        • LeafletOP
          1 point · edited · 14 days ago

          There’s nothing technical stopping Google from sending the prompt text (and maybe the generated results) back to their servers, only the political and social backlash over worsened privacy.