6. api.py Breakdown

Wrapper module for interacting with Ollama and Open WebUI services. Handles prompt generation, WebUI messaging, and file uploads to knowledge bases.


1. query_ollama(prompt: str, model: str = "llama3") -> str

```python
def query_ollama(prompt: str, model: str = "llama3") -> str:
```

| Field | Description |
| --- | --- |
| Purpose | Sends a text prompt to Ollama and retrieves the model’s response |
| Endpoint | `${OLLAMA_URL}/api/generate` |
| Payload | JSON body with `model` and `prompt` fields (Ollama’s standard generate payload; see the sketch below) |
| Return | Response string or error message |
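
A minimal sketch of the wrapper, assuming the standard Ollama generate payload (`model`, `prompt`, `stream`) and a module-level `OLLAMA_URL`; the environment-variable default and the error-message wording are illustrative assumptions, not confirmed by the source.

```python
import os

import requests

OLLAMA_URL = os.getenv("OLLAMA_URL", "http://localhost:11434")  # assumed default
DEFAULT_TIMEOUT = 10  # seconds

def query_ollama(prompt: str, model: str = "llama3") -> str:
    """Send a prompt to Ollama's /api/generate endpoint and return the generated text."""
    try:
        resp = requests.post(
            f"{OLLAMA_URL}/api/generate",
            json={"model": model, "prompt": prompt, "stream": False},
            timeout=DEFAULT_TIMEOUT,
        )
        resp.raise_for_status()
        # Non-streaming responses carry the full completion in the "response" field.
        return resp.json().get("response", "")
    except requests.RequestException as exc:
        return f"Error querying Ollama: {exc}"  # error-message format is an assumption
```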

2. query_open_webui(query: str, model: str) -> Union[Dict, str]

```python
def query_open_webui(query: str, model: str) -> Union[Dict, str]:
```

| Field | Description |
| --- | --- |
| Purpose | Sends a chat query to Open WebUI using the specified model |
| Endpoint | `${OPEN_WEBUI_URL}/api/chat` |
| Payload | Standard ChatML-style structure (see the example below) |
| Return | Full JSON response or error string |

Payload Example

```json
{
  "messages": [
    {"role": "user", "content": "What is RAG?"}
  ],
  "model": "llama3"
}
```
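
A comparable sketch for the chat wrapper; the bearer-token header, its environment variable, and the default base URL are assumptions for illustration, since the breakdown does not describe authentication.

```python
import os
from typing import Dict, Union

import requests

OPEN_WEBUI_URL = os.getenv("OPEN_WEBUI_URL", "http://localhost:3000")  # assumed default
OPEN_WEBUI_TOKEN = os.getenv("OPEN_WEBUI_TOKEN", "")                   # assumed auth token
DEFAULT_TIMEOUT = 10  # seconds

def query_open_webui(query: str, model: str) -> Union[Dict, str]:
    """Post a single-turn chat message to Open WebUI and return the raw JSON response."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": query}],
    }
    headers = {"Authorization": f"Bearer {OPEN_WEBUI_TOKEN}"} if OPEN_WEBUI_TOKEN else {}
    try:
        resp = requests.post(
            f"{OPEN_WEBUI_URL}/api/chat",
            json=payload,
            headers=headers,
            timeout=DEFAULT_TIMEOUT,
        )
        resp.raise_for_status()
        return resp.json()  # full JSON response, as documented above
    except requests.RequestException as exc:
        return f"Error querying Open WebUI: {exc}"
```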

3. upload_file_to_knowledge(knowledge_id: str, filepath: str) -> Union[Dict, str]

```python
def upload_file_to_knowledge(knowledge_id: str, filepath: str) -> Union[Dict, str]:
```

| Field | Description |
| --- | --- |
| Purpose | Uploads a document to a specified Open WebUI knowledge base |
| Endpoint | `${OPEN_WEBUI_URL}/api/knowledge/<knowledge_id>/documents` |
| Method | Multipart file upload (`files={...}`) |
| Return | Success response or error message |
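
A hedged sketch of the multipart upload; the form field name ("file"), the default base URL, and the error handling are assumptions, not taken from the source.

```python
import os
from typing import Dict, Union

import requests

OPEN_WEBUI_URL = os.getenv("OPEN_WEBUI_URL", "http://localhost:3000")  # assumed default
DEFAULT_TIMEOUT = 10  # seconds

def upload_file_to_knowledge(knowledge_id: str, filepath: str) -> Union[Dict, str]:
    """Upload a local file to the given Open WebUI knowledge base via multipart POST."""
    url = f"{OPEN_WEBUI_URL}/api/knowledge/{knowledge_id}/documents"
    try:
        with open(filepath, "rb") as fh:
            resp = requests.post(
                url,
                files={"file": (os.path.basename(filepath), fh)},  # field name is an assumption
                timeout=DEFAULT_TIMEOUT,
            )
        resp.raise_for_status()
        return resp.json()
    except (OSError, requests.RequestException) as exc:
        return f"Error uploading file: {exc}"
```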

4. Shared Settings

```python
DEFAULT_TIMEOUT = 10  # seconds
```
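
The timeout sits alongside the base URLs interpolated by the endpoints above. A plausible configuration block, assuming both URLs are read from environment variables; the variable names and local-port defaults are illustrative, not confirmed by the source.

```python
import os

# Base URLs referenced by the endpoints above; env-var names and defaults are assumptions.
OLLAMA_URL = os.getenv("OLLAMA_URL", "http://localhost:11434")
OPEN_WEBUI_URL = os.getenv("OPEN_WEBUI_URL", "http://localhost:3000")

# Shared timeout applied to every outgoing request in this module.
DEFAULT_TIMEOUT = 10  # seconds
```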

Summary Table

| Function | Purpose | Endpoint | Return |
| --- | --- | --- | --- |
| query_ollama | Send prompt to Ollama LLM | /api/generate | str |
| query_open_webui | Send message to WebUI chat | /api/chat | dict or str |
| upload_file_to_knowledge | Upload file to KB in WebUI | /api/knowledge/... | dict or str |
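
Putting the three wrappers together; the knowledge-base ID and file path below are hypothetical placeholders, and the import assumes the module is importable as `api`.

```python
from api import query_ollama, query_open_webui, upload_file_to_knowledge

# Plain completion via Ollama.
print(query_ollama("Summarize retrieval-augmented generation in one sentence."))

# Chat-style query via Open WebUI.
print(query_open_webui("What is RAG?", model="llama3"))

# Upload a document to a knowledge base ("kb-1234" and the path are placeholders).
print(upload_file_to_knowledge("kb-1234", "docs/rag_overview.pdf"))
```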