# Camel Chat

Cross-platform app to chat with your own Ollama server.
Camel Chat is a feature-rich Flutter application designed to provide a seamless interface for communicating with large language models (LLMs) served via an Ollama server. It offers a user-friendly way to interact with open-source AI models on your own hardware.
## Features

- **Connect to Ollama Servers**: Easily connect to any Ollama server, with optional basic HTTP authentication.
- **Multiple Model Support**: Chat with any model available on your Ollama server.
- **Complete Chat History**: View and manage your conversation history.
- **Dark Mode Support**: Switch between light and dark themes for comfortable viewing.
- **Custom System Prompts**: Define system prompts to set the AI's behaviour and context.
- **Export Conversations**: Export your chats as Markdown files for sharing or archiving.
- **Chat Organisation**: Auto-generated meaningful titles for your conversations.
- **Responsive UI**: Works seamlessly on both mobile and desktop devices.
- **Code Formatting**: Proper rendering and formatting of code blocks in responses.
- **Local Storage**: All your conversations are stored locally for privacy.
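For reference, the app talks to Ollama's standard REST API. Here is a minimal, purely illustrative sketch (in Python rather than Dart) of how a client can build a chat request with optional basic auth — the server address, credentials, and model name below are placeholders, not defaults:

```python
import base64
import json
import urllib.request


def build_chat_request(server, model, messages, username=None, password=None):
    """Build an HTTP request for Ollama's /api/chat endpoint."""
    req = urllib.request.Request(
        f"{server}/api/chat",
        data=json.dumps({"model": model, "messages": messages}).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    if username and password:
        # Optional basic HTTP auth: base64-encoded "username:password".
        token = base64.b64encode(f"{username}:{password}".encode()).decode()
        req.add_header("Authorization", f"Basic {token}")
    return req


# Example: a request to a hypothetical server on the local network.
req = build_chat_request(
    "http://192.168.1.10:11434",
    "llama3",
    [{"role": "user", "content": "Hello!"}],
    username="user",
    password="secret",
)
```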
## Getting Started

### Prerequisites

- A running Ollama server (local or remote).
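By default, Ollama listens only on `127.0.0.1:11434`. If the server runs on a different machine than the app, you can bind it to all interfaces so other devices on your local network can reach it (adjust to your setup and network security needs):

```shell
# Make Ollama reachable from other devices on the local network.
OLLAMA_HOST=0.0.0.0 ollama serve
```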
### Installation
#### Android

Download and install the APK from the releases page.
#### Linux

Choose one of the following packages from the releases page:

- **Debian/Ubuntu**: Download and install the `.deb` package.
- **Fedora/RHEL**: Download and install the `.rpm` package.
- **Arch**: Download and install the `.zst` package.
- **Other distributions**: Download the AppImage, make it executable, and run it.
This app is nothing compared to Open WebUI's feature set; I just wanted to build a simple native app, honestly mostly to see whether I could. Also, Open WebUI is slightly complex for someone who is not into self-hosting. Camel Chat is for someone who just installs Ollama on a laptop or other computer and exposes it on the local network.