https://github.com/open-webui/open-webui/blob/main/docs/apache.md
On your latest installation of Ollama, make sure that you have set up your API server per the official Ollama reference.
That guide doesn't match the current service file shipped on Linux, so we will address it here.
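Before editing anything, it's worth confirming that a stock Ollama install is up and answering on its default port. This check is a sketch, not part of the original guide; it assumes Ollama is running locally on the default port 11434:

```bash
# Assumes a default local install; not from the original guide.
# The root endpoint should answer "Ollama is running".
curl http://localhost:11434

# Report the installed Ollama version via the REST API.
curl http://localhost:11434/api/version
```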
Unless you're compiling Ollama from source, installing with the standard install script

```
curl https://ollama.com/install.sh | sh
```

creates a file called ollama.service in /etc/systemd/system. You can use nano to edit the file:

```
sudo nano /etc/systemd/system/ollama.service
```
Add the following lines to the [Service] section:

```
Environment="OLLAMA_HOST=0.0.0.0:11434" # this line is mandatory. You can also specify 192.168.254.109:DIFFERENT_PORT format
Environment="OLLAMA_ORIGINS=http://192.168.254.106:11434,https://models.server.city" # this line is optional
```
For instance:

```
[Unit]
Description=Ollama Service
After=network-online.target

[Service]
ExecStart=/usr/local/bin/ollama serve
Environment="OLLAMA_HOST=0.0.0.0:11434" # this line is mandatory. You can also specify 192.168.254.109:DIFFERENT_PORT format
Environment="OLLAMA_ORIGINS=http://192.168.254.106:11434,https://models.server.city" # this line is optional
User=ollama
Group=ollama
Restart=always
RestartSec=3
Environment="PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/s>

[Install]
WantedBy=default.target
```
Save the file by pressing CTRL+S, then exit nano by pressing CTRL+X.
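The edited unit file won't take effect until systemd rereads it. Rather than rebooting, you can apply the change immediately; these are standard systemctl commands, shown here as a suggested step the guide leaves implicit:

```bash
# Tell systemd to reread unit files after editing ollama.service.
sudo systemctl daemon-reload

# Restart Ollama so it picks up the new OLLAMA_HOST / OLLAMA_ORIGINS values.
sudo systemctl restart ollama

# Optional: confirm the service came back up cleanly.
systemctl status ollama
```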
Once the service restarts, the Ollama server will be listening on the IP:PORT you specified, in this case 0.0.0.0:11434, meaning it accepts connections on every interface, including your local IP (e.g. 192.168.254.106:11434). If you want it reachable from outside your network, make sure your router is configured to forward port 11434 to that local IP.
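To confirm the server is actually reachable over the network, you can query it from another machine on the same LAN. A minimal sketch, assuming your server's local address is the 192.168.254.106 used in the example above and that you kept port 11434:

```bash
# From another machine on the LAN: list the models the server hosts.
# Replace 192.168.254.106 with your server's actual local IP.
curl http://192.168.254.106:11434/api/tags
```

If this returns JSON instead of a connection error, Ollama is listening on all interfaces as configured.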