GitHub Repo

Open WebUI is a user-friendly frontend for self-hosted models, and it works especially well with Ollama.

Ollama API

You can use Open WebUI as a proxy for an Ollama instance, which is useful when you want to expose Ollama to the web while still requiring authentication.

Set your base URL to https://webui.example.com/ollama and add an Authorization header containing an API token generated through the app: Authorization: Bearer {API_TOKEN}.
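Any HTTP client works once the header is set. Here is a minimal sketch that builds a request to Ollama's standard /api/generate endpoint through the proxy; the deployment URL and the OLLAMA_TOKEN environment variable are assumptions, so substitute your own values:

```python
import os


def ollama_request(base_url: str, token: str, prompt: str, model: str = "llama2"):
    """Assemble the URL, headers, and body for a proxied Ollama generate call."""
    return {
        # Open WebUI forwards the standard Ollama API paths under /ollama.
        "url": f"{base_url}/api/generate",
        # The proxy authenticates via a Bearer token, not basic auth.
        "headers": {"Authorization": f"Bearer {token}"},
        "json": {"model": model, "prompt": prompt, "stream": False},
    }


req = ollama_request(
    "https://webui.example.com/ollama",       # hypothetical deployment URL
    os.getenv("OLLAMA_TOKEN", "dummy-token"), # token generated in the app
    "Why is the sky blue?",
)
print(req["url"])
```

With a real token and URL, you could send this with `requests.post(**req)`.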

Here is an example using LangChain:

```python
import os

from langchain_community.llms import Ollama

ollama = Ollama(
    model="llama2",
    base_url="https://webui.example.com/ollama",
    headers={"Authorization": f"Bearer {os.getenv('OLLAMA_TOKEN')}"},
)

ollama.invoke(...)
```