What are your thoughts on #privacy and #itsecurity regarding the #LocalLLMs you use? They seem to be an alternative to ChatGPT, MS Copilot etc. which basically are creepy privacy black boxes. How can you be sure that local LLMs do not A) “phone home” or B) create a profile on you, C) that their analysis is restricted to the scope of your terminal? As far as I can see #ollama and #lmstudio do not provide privacy statements.

  • blackboxwarrior@lemmy.ml · 1 day ago

    > Ollama used to be Facebook’s proprietary model

    Just to be clear: Llama is the Facebook (Meta) model; Ollama is the software that lets you run Llama (along with many other models).

    Ollama has internet access (otherwise, how could it download models?). The only true privacy solution is to run it in a container with no network access after the models have been downloaded, or to air-gap your computer.
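    The container approach above can be sketched with Docker: pull the model once with networking enabled, persist it in a volume, then start a fresh container with `--network none` so nothing can phone home. This is a sketch under assumptions, not official guidance: the container names (`ollama-dl`, `ollama-offline`) and model choice (`llama3`) are my own, and the `/root/.ollama` model path is what the official `ollama/ollama` image uses at the time of writing — check the Ollama docs for your version.

    ```shell
    # Step 1: temporary container WITH network, just to download the model
    # into a host directory that will be reused later.
    docker run -d --name ollama-dl \
      -v "$PWD/ollama-models:/root/.ollama" \
      ollama/ollama
    docker exec ollama-dl ollama pull llama3
    docker rm -f ollama-dl

    # Step 2: permanent container WITHOUT any network access
    # (--network none removes all interfaces except loopback),
    # mounting the already-downloaded models read-only.
    docker run -d --name ollama-offline \
      --network none \
      -v "$PWD/ollama-models:/root/.ollama" \
      ollama/ollama

    # Interact from inside the container, since no ports can be
    # published on a container that has no network.
    docker exec -it ollama-offline ollama run llama3
    ```

    Note the trade-off: with `--network none` you also can't reach the Ollama HTTP API from the host, so everything goes through `docker exec`. A softer alternative is a custom `internal` Docker network, which allows host access but blocks outbound traffic.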