I got "Open WebUI with Bundled Ollama Support" (https://github.com/open-webui/open-webui) installed as a Docker container and it's working like a dream with Llama 3.2 (Ollama).
Now for some "newbie" questions...
1) The GPU worked at first, but the morning after it seems to have stopped being used - everything now runs on the CPU only. Any suggestions?
2) Is there a config file for setting things up?
3) I can't seem to communicate with it through curl. POST requests all fail; GET sometimes gets a response, but when it does it says "Method Not Allowed". I've seen so many basic curl (JSON) examples, but I just don't get a response - and when I do, it's a very long HTML response.
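
For reference, this is roughly the kind of request I've been sending. The port and the endpoint path are my guesses based on the Ollama docs, so they may well be the problem:

```shell
# My curl attempt - assuming the container's web port is published on 3000
# and that the Ollama-style /api/generate endpoint is reachable there
# (both assumptions may be wrong; this is what returns the long HTML page).
curl -X POST http://localhost:3000/api/generate \
  -H "Content-Type: application/json" \
  -d '{"model": "llama3.2", "prompt": "Why is the sky blue?"}'
```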
I have searched but can't seem to find any "old school" config files that might (or might not) help me.
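
For context, I started the container with something like this, reconstructed from memory and the project README, so the exact flags and image tag may not be what I actually ran:

```shell
# Roughly how I launched the bundled-Ollama image (from memory, may be inexact).
# As far as I can tell, configuration is done via environment variables (-e)
# rather than a config file - OLLAMA_BASE_URL is one example from the README.
docker run -d -p 3000:8080 --gpus=all \
  -v ollama:/root/.ollama \
  -v open-webui:/app/backend/data \
  --name open-webui --restart always \
  ghcr.io/open-webui/open-webui:ollama
```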
Hope you can help me......🙂
Cheers