• amzd
    1 year ago

    If you have a GPU in your PC, it's almost always faster to just run your own LLM locally, and you won't have this issue. Search for Ollama.
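
    Getting started is a couple of commands. A minimal sketch using the Ollama CLI and its local HTTP API (the model name `llama3.2` is just an example; pick whatever fits your GPU's VRAM):

    ```shell
    # Download a model and chat with it interactively
    ollama pull llama3.2
    ollama run llama3.2

    # Ollama also serves a local API on port 11434,
    # so other tools on your machine can use the model:
    curl http://localhost:11434/api/generate \
      -d '{"model": "llama3.2", "prompt": "Why is the sky blue?", "stream": false}'
    ```

    Everything runs on your own hardware, so no rate limits and nothing leaves your machine.
    
    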