Awesome! One cool thing you can try is installing "Open WebUI" alongside it. It gives you a smooth, locally hosted web interface for interacting with it, and it saves your chat history, etc.
Open WebUI is an extensible, feature-rich, and user-friendly self-hosted WebUI designed to operate entirely offline. It supports various LLM runners, including Ollama and OpenAI-compatible APIs.
https://github.com/open-webui/open-webui
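If you have Docker, something like the container command from the project README is the quickest way to try it (this is a sketch from memory, so double-check the README; it assumes Ollama is already running on the host at its default port, 11434):

```shell
# Run Open WebUI in Docker; assumes Ollama is on the host machine.
docker run -d \
  -p 3000:8080 \                                        # UI served on http://localhost:3000
  --add-host=host.docker.internal:host-gateway \        # let the container reach host Ollama
  -v open-webui:/app/backend/data \                     # persist chats/settings in a volume
  --name open-webui \
  --restart always \
  ghcr.io/open-webui/open-webui:main
```

Then open http://localhost:3000 in your browser, create the local admin account, and your Ollama models should show up in the model picker.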
Already using it!