dino@discuss.tchncs.de to Linux@lemmy.ml • Running Generative AI Models Locally with Ollama and Open WebUI - Fedora Magazine • English
10 · 2 days ago

I tried it briefly, but it's hot garbage if you don't have potent hardware. The number of iterations you have to go through to get proper answers, and the time it takes to produce them, make it a waste of time.
Ctrl + R