• pezhore@infosec.pub · 2 days ago

    That’s why I’ve stopped using non-local LLMs. Ollama works just fine on my outdated RTX 2060.