• @lepinkainen@lemmy.world
    3 points · 1 month ago

    They don’t use “ChatGPT”; you can use local models, and they cost next to nothing at Meta’s or Google’s scale. Both run their own servers for it.

    • stebo
      2 points · 1 month ago

      Of course they don’t use ChatGPT, and whatever they do use isn’t comparable to ChatGPT, because that would be unsustainable.

      • @lepinkainen@lemmy.world
        1 point · 1 month ago

        “Isn’t comparable”? For generic tasks that’s true.

        Figuring out shittily censored words from pictures and subtitles? The custom models are even better at that.
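
        The subtitle case doesn’t even need anything fancy. Here’s a minimal sketch with an off-the-shelf local fill-mask model guessing a bleeped word from context (the model and the example line are just illustrative, not what Meta or Google actually run):

        ```python
        # Guess a censored word in a subtitle line with a small local model.
        # Runs offline once the model is cached; no ChatGPT-style API involved.
        from transformers import pipeline

        # fill-mask ranks candidate tokens for the masked position
        unmasker = pipeline("fill-mask", model="bert-base-uncased")

        subtitle = "I can't believe you [MASK] forgot the keys again."
        for guess in unmasker(subtitle, top_k=3):
            print(f"{guess['token_str']:>12}  score={guess['score']:.3f}")
        ```

        A vision model for the picture case would be heavier, but the point stands: it’s a cheap, local, single-purpose model, not a general chatbot.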