• @lepinkainen@lemmy.world
    3 points · 12 days ago

    They don’t use ChatGPT; you can use local models, and those cost next to nothing at Meta’s or Google’s scale. Both run their own servers for it.
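
    Rough sketch of what I mean, using the Hugging Face transformers library as a stand-in. The model name is just an example, obviously not whatever Meta or Google actually run:

    ```python
    # Minimal sketch: run a small open model locally instead of calling ChatGPT.
    # "gpt2" is only a placeholder; swap in whatever open model you like.
    from transformers import pipeline

    generator = pipeline("text-generation", model="gpt2")

    out = generator("Summarize this post in one line:", max_new_tokens=30)
    print(out[0]["generated_text"])
    ```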

    • stebo
      2 points · 12 days ago

      of course they don’t use chatgpt and whatever they use isn’t comparable to chatgpt cuz that would be unsustainable

      • @lepinkainen@lemmy.world
        1 point · 12 days ago

        “Isn’t comparable”? For generic tasks that’s true.

        Figuring out shittily censored words from pictures and subtitles? The custom models are even better at that.
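
        If anyone wants to see the subtitle half of that, here's a toy sketch with an off-the-shelf masked-language model guessing a bleeped word from context. Purely illustrative, not the custom models I'm talking about:

        ```python
        # Toy sketch: guess a censored word in a subtitle line from its context.
        # bert-base-uncased is just an off-the-shelf example model.
        from transformers import pipeline

        fill = pipeline("fill-mask", model="bert-base-uncased")

        # Replace the bleeped word with the model's mask token and let it guess.
        line = f"I can't believe you {fill.tokenizer.mask_token} forgot the keys again."
        for guess in fill(line, top_k=3):
            print(guess["token_str"], round(guess["score"], 3))
        ```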