Ashley@lemmy.ca to Linux@lemmy.ml • Running Generative AI Models Locally with Ollama and Open WebUI - Fedora Magazine • 21 hours ago
Alpaca is great. I can even run it on my OnePlus 6T, albeit slowly, and the largest model I got running was Llama 7B.
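
(Not part of the original comment.) Alpaca fronts a local Ollama instance, so once a 7B model is pulled you can also poke at it from code through Ollama's HTTP API. The sketch below assumes Ollama's default port 11434 and uses the model tag "llama2:7b" purely as an example; swap in whatever 7B tag you actually pulled.

```python
# Minimal sketch: send a one-off, non-streaming request to a locally
# running Ollama server (the backend Alpaca and Open WebUI talk to).
# Assumes Ollama listens on its default port 11434 and the model tag
# "llama2:7b" has already been pulled -- the tag is an example, not
# something from the original comment.
import json
import urllib.request


def generate(prompt: str, model: str = "llama2:7b") -> str:
    """Ask the local Ollama API for a completion and return its text."""
    payload = json.dumps({
        "model": model,
        "prompt": prompt,
        "stream": False,  # return a single JSON object instead of a stream
    }).encode("utf-8")
    req = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]


if __name__ == "__main__":
    print(generate("Why is the sky blue?"))
```

On a device as constrained as a OnePlus 6T, expect responses from a 7B model to take a while, as the comment notes.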