MIXEDUNIVERS@discuss.tchncs.de to Linux@lemmy.ml • Running Generative AI Models Locally with Ollama and Open WebUI - Fedora Magazine
I tried to use it on Fedora, but I have a Radeon 6700 XT and it only ran on the CPU. I'll wait until official ROCm support reaches my older model.
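(In the meantime, a commonly reported workaround for RDNA2 cards that ROCm doesn't officially list, such as the RX 6700 XT, is to make the ROCm runtime treat the GPU as the supported gfx1030 ISA via the `HSA_OVERRIDE_GFX_VERSION` environment variable. This is a sketch of the usual approach, not a guarantee — it is unsupported and may not work on every setup:)

```shell
# Unofficial workaround: report the RX 6700 XT (gfx1031) as gfx1030,
# which ROCm does support. Unsupported; results vary per card/driver.
export HSA_OVERRIDE_GFX_VERSION=10.3.0

# Then start Ollama with the override in its environment:
ollama serve
```

(If Ollama runs as a systemd service, the variable has to be set in the service's environment, e.g. via a drop-in override, rather than in an interactive shell.)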