LocalLLaMA @sh.itjust.works · HumanPerson @sh.itjust.works · 9mo ago

Fixed it

Seriously though, does anyone know how to use Open WebUI with the new version?

Edit: if you go into the ollama container with `sudo docker exec -it <container name> bash`, you can pull models with `ollama pull llama3.1:8b`, for example, and then the model is available.
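For anyone else hitting this, the edit above sketches out to something like the following. The container name `ollama` is an assumption — yours may differ depending on your compose file, so check with `docker ps` first:

```shell
# Find the actual container name (assumed "ollama" below):
sudo docker ps

# Open a shell inside the ollama container:
sudo docker exec -it ollama bash

# Inside the container, pull a model:
ollama pull llama3.1:8b

# Alternatively, run the pull in one step without an interactive shell:
sudo docker exec ollama ollama pull llama3.1:8b
```

Once the pull finishes, the model should show up in Open WebUI's model dropdown.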