lelgenio@lemmy.ml to Linux@lemmy.ml • Running Generative AI Models Locally with Ollama and Open WebUI - Fedora Magazine • 2 days ago
Ollama runs on the 6700 XT, but you need to add an environment variable for it to work… I just don’t remember what it was, and I’m away from my computer right now.
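
[Editor's note: a hedged guess at the variable meant here, assuming the common ROCm workaround for RDNA2 cards like the RX 6700 XT that aren't on the official support list; the commenter does not confirm this specific value:]

    # assumption: report the 6700 XT (gfx1031) as a supported gfx1030 part so ROCm accepts it
    HSA_OVERRIDE_GFX_VERSION=10.3.0 ollama serve

[If Ollama runs as a systemd service, the same variable would go in a service override (e.g. via systemctl edit ollama) rather than on the command line.]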