GitHub Repo

Open Web UI is a user-friendly frontend for self-hosted models, and it works particularly well with Ollama.

Ollama API

You can use Open Web UI as a proxy for an Ollama instance, which is useful if you want to expose Ollama to the web while still requiring authentication.

Set your base URL to https://webui.example.com/ollama and add an Authorization header containing an API token that you have generated through the app: Authorization: Bearer {API_TOKEN}.

Here is an example using LangChain:

import os
from langchain_community.llms import Ollama
 
# Point the client at the Open Web UI proxy and authenticate with the API token
ollama = Ollama(
    model="llama2",
    base_url="https://webui.example.com/ollama",
    headers={"Authorization": f"Bearer {os.getenv('OLLAMA_TOKEN')}"},
)
 
ollama.invoke(...)
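
If you are not using LangChain, you can also call the proxied Ollama API directly over HTTP. The following is a minimal sketch using the requests library; it assumes Open Web UI exposes the standard Ollama generate endpoint under the /ollama prefix, and the hostname and model name are placeholders.

import os
import requests

# Call Ollama's generate endpoint through the Open Web UI proxy,
# authenticating with the same API token as above
response = requests.post(
    "https://webui.example.com/ollama/api/generate",
    headers={"Authorization": f"Bearer {os.getenv('OLLAMA_TOKEN')}"},
    json={"model": "llama2", "prompt": "Why is the sky blue?", "stream": False},
)
response.raise_for_status()
print(response.json()["response"])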

LiteLLM Integration

Open Web UI used to bundle a self-contained LiteLLM instance that allowed you to connect to external models. However, this was removed to simplify the platform, and the developers now recommend running your own LiteLLM instance.

Connecting to an External LiteLLM Instance

Go to Profile > Settings > Admin Panel > Connections. Under the OpenAI API settings, enter the URL of your LiteLLM instance with /v1 appended (e.g. http://localhost:4000/v1 if it is running on the same machine) and use your LITELLM_MASTER_KEY value as the API key.
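
Before entering these values in the UI, it can help to confirm that the LiteLLM endpoint is reachable and serving models. A minimal sketch using the openai Python client, assuming LiteLLM is listening on http://localhost:4000 and LITELLM_MASTER_KEY is set in the environment:

import os
from openai import OpenAI

# Point the OpenAI client at the LiteLLM proxy; the master key acts as the API key
client = OpenAI(
    base_url="http://localhost:4000/v1",
    api_key=os.getenv("LITELLM_MASTER_KEY"),
)

# List the models LiteLLM exposes -- these are the models Open Web UI will see
for model in client.models.list():
    print(model.id)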

See Complete Self Hosted AI Setup.