• 3abas@lemm.ee · 6 hours ago

    You can run a model locally on your phone and it will answer most prompts without breaking a sweat. It actually uses far less energy than a Google search, where the content has to be fetched from a website that's hosted 24/7 just waiting for you to access it.

    Training a model is expensive; using it isn't.
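
    To make that concrete, here's a minimal sketch of local inference with llama-cpp-python and a small quantized model, running entirely on CPU. The model file name is just an example, not a specific recommendation:

    ```python
    # pip install llama-cpp-python
    from llama_cpp import Llama

    # Load a small quantized GGUF model (file name is an example; any ~1-2B model works).
    llm = Llama(model_path="./qwen2.5-1.5b-instruct-q4_k_m.gguf", n_ctx=2048)

    # One chat-style prompt; this runs on CPU by default and answers in a few seconds.
    out = llm.create_chat_completion(
        messages=[{"role": "user", "content": "Explain in one sentence what a GGUF file is."}],
        max_tokens=128,
    )
    print(out["choices"][0]["message"]["content"])
    ```

    Phone apps wrap the same idea behind a UI: a quantized model a couple of gigabytes in size, loaded once and queried on-device.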

    • bystander@lemmy.ca · 5 hours ago

      I would like to learn a bit more about this; I keep hearing it in conversations here and there. Do you have links to studies or data on this?

    • Witziger_Waschbaer@feddit.org · 5 hours ago

      Can you link to the model you're talking about? I experimented with running some models on my server, but had a rather tough time without a GPU.
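
      For context, my attempts looked roughly like this plain CPU-only setup (the model name here is just a placeholder), which is where things got slow for me:

      ```python
      # pip install transformers torch
      from transformers import pipeline

      # Plain (non-quantized) text generation forced onto CPU; without a GPU this
      # slows down quickly as model size grows, hence the interest in small models.
      generator = pipeline(
          "text-generation",
          model="Qwen/Qwen2.5-0.5B-Instruct",  # placeholder small model
          device=-1,  # -1 = CPU
      )
      print(generator("What does quantization do?", max_new_tokens=64)[0]["generated_text"])
      ```

      Is that the kind of setup you mean, or a quantized GGUF-style one?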