• There’s a lot of explaining to do for Meta, OpenAI, Claude and Google Gemini to justify the prices of their models now that there’s a literal open-source model that can do the basics.

    • Zement
5 months ago

Yes, GPT4All if you want to try it yourself without coding know-how.

• suoko (OP)
5 months ago

I’m testing VS Code + Continue + Ollama + qwen2.5-coder right now. With a simple GPU it’s already OK.
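For anyone wanting to reproduce a setup like the one above: Continue is pointed at a local Ollama model through its config file. A sketch of what that entry can look like (the exact schema varies by Continue version, and the model tags here are assumptions — pick whichever qwen2.5-coder size your hardware handles):

```json
{
  "models": [
    {
      "title": "Qwen2.5 Coder (local)",
      "provider": "ollama",
      "model": "qwen2.5-coder:14b"
    }
  ],
  "tabAutocompleteModel": {
    "title": "Qwen2.5 Coder autocomplete",
    "provider": "ollama",
    "model": "qwen2.5-coder:1.5b"
  }
}
```

Using a smaller variant for tab autocomplete keeps latency low while the bigger model handles chat.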

• suoko (OP)
5 months ago

You still need expensive hardware to run it. Unless the myceliumwebserver project takes off.

        • @Scipitie@lemmy.dbzer0.com
5 months ago

How much VRAM does your Ti pack? Is that the standard 8 GB GDDR6?

          I will because I’m surprised and impressed that a 14b model runs smoothly.

          Thanks for the insights!

          • @birdcat@lemmy.ml
5 months ago

I don’t even have a GPU and the 14b model runs at an acceptable speed. But yes, faster and bigger would be nice… or knowing how to distill the biggest one, because I only use it for something very specific.
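On the distillation idea: the core trick is training the small model to match the big model’s temperature-softened output distribution rather than hard labels. A minimal pure-Python sketch of that loss on toy logits (not a real training loop — the logit values and learning rate are made up for illustration):

```python
import math

def softmax(logits, T=1.0):
    # Temperature-scaled softmax; higher T gives softer, more informative targets.
    exps = [math.exp(l / T) for l in logits]
    total = sum(exps)
    return [e / total for e in exps]

def kl_divergence(p, q):
    # KL(p || q): how far the student distribution q is from the teacher p.
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

T = 2.0                             # distillation temperature
teacher_logits = [4.0, 1.0, 0.2]    # hypothetical teacher outputs for one token
student_logits = [1.0, 1.0, 1.0]    # untrained student starts out uniform

teacher_soft = softmax(teacher_logits, T)
lr = 1.0
for _ in range(300):
    student_soft = softmax(student_logits, T)
    # Gradient of KL(teacher || student) w.r.t. student logits is (q - p) / T.
    grads = [(q - p) / T for p, q in zip(teacher_soft, student_soft)]
    student_logits = [z - lr * g for z, g in zip(student_logits, grads)]

loss = kl_divergence(teacher_soft, softmax(student_logits, T))
print(f"distillation loss after training: {loss:.6f}")
```

Real distillation (e.g. of a big open model into a small one) does the same thing per token over a corpus, with the student’s weights updated by backprop instead of tweaking raw logits directly.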