• @3abas@lemm.ee
    2 · 13 days ago

    You can run a model locally on your phone and it will answer most prompts without breaking a sweat. It actually uses way less energy than googling and loading a page from a website that's hosted 24/7 just waiting for you to access it.

    Training a model is expensive; using it isn't.
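
    For anyone curious what "running a model locally" looks like in practice, here's a minimal sketch using llama-cpp-python on CPU. The model file name is purely illustrative; any small quantized GGUF you've downloaded would do.

    ```python
    # Minimal CPU-only local inference sketch with llama-cpp-python.
    # The GGUF file name below is an example, not a specific recommendation.
    from llama_cpp import Llama

    llm = Llama(model_path="phi-3-mini-4k-instruct-q4.gguf", n_ctx=2048)

    out = llm("Q: How tall is the Eiffel Tower? A:", max_tokens=32)
    print(out["choices"][0]["text"])
    ```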

    • @bystander@lemmy.ca
      4 · 13 days ago

      I would like to learn a bit more about this; I keep hearing it in conversations here and there. Do you have links to studies or data on this?

    • @Witziger_Waschbaer@feddit.org
      4 · 13 days ago

      Can you link to the model you're talking about? I experimented with running some models on my server, but had a rather tough time without a GPU.

    • @squaresinger@lemmy.world
      3 · 13 days ago

      Nice claim you have there. Do you have anything to back that up?

      If it’s so easy, it shouldn’t be hard for you to link a model like that.