RE: LeoThread 2024-11-09 12:58

in LeoFinance · 2 months ago

Awesome! One cool thing you can try is installing "Open WebUI" alongside it. It gives you a smooth, locally hosted web interface for chatting with your models, and it saves your chat history too.

Open WebUI is an extensible, feature-rich, and user-friendly self-hosted WebUI designed to operate entirely offline. It supports various LLM runners, including Ollama and OpenAI-compatible APIs.
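
If you want to give it a spin, the quickest route I've seen is the Docker image. Something along these lines should work when Ollama is already running on the same machine; I'm paraphrasing the project's README from memory, so double-check the repo below for the current flags and image tag:

```bash
# Run Open WebUI in Docker and let it reach the host's Ollama instance.
# Port 3000 on the host maps to the UI; chat data persists in the "open-webui" volume.
docker run -d \
  -p 3000:8080 \
  --add-host=host.docker.internal:host-gateway \
  -v open-webui:/app/backend/data \
  --name open-webui \
  --restart always \
  ghcr.io/open-webui/open-webui:main
```

Once the container is up, the UI should be reachable at http://localhost:3000, and it should detect the local Ollama instance automatically.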

https://github.com/open-webui/open-webui