# Getting Started
Everything you need to go from zero to your first local AI conversation, in about five minutes.
## Requirements

- A Mac or Windows PC (Ollama itself also runs on Linux)
- 4 GB RAM minimum, 8 GB recommended (see the model table below)
- 2–27 GB of free disk space, depending on the model you pull
- Ollama, installed in Step 1
## Step 1 — Install Ollama
LexiChat uses Ollama as its local model server. Ollama manages downloads, GPU acceleration, and API serving automatically.
```shell
# macOS / Linux — one-liner installer
curl -fsSL https://ollama.com/install.sh | sh
```

On Windows, download and run the Ollama installer from ollama.com/download.
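Ollama serves its API on port 11434 by default. Before moving on, you can confirm the server is reachable; this sketch assumes the default local endpoint and degrades gracefully if nothing is listening yet:

```shell
# Sanity check: is the Ollama server up on its default port?
# (assumption: default endpoint http://127.0.0.1:11434)
if curl -fsS http://127.0.0.1:11434/api/version >/dev/null 2>&1; then
  status="ollama: running"
else
  status="ollama: not reachable (run 'ollama serve' or launch the Ollama app)"
fi
echo "$status"
```

If you see "not reachable", start the server and re-run the check before pulling models.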
## Step 2 — Pull a Model
Once Ollama is running, pull a model. We recommend starting with gemma3 — it's fast, capable, and supports tool calling.
```shell
# Recommended starting model (~5 GB)
ollama pull gemma3

# Larger, more capable (~27 GB — needs 32 GB RAM)
ollama pull gemma3:27b

# Tiny model for low-RAM machines (~2 GB)
ollama pull qwen2.5:3b
```
| Model | Size | RAM needed | Best for |
|---|---|---|---|
| gemma3 | ~5 GB | 8 GB | General use, tools, writing |
| gemma3:27b | ~27 GB | 32 GB | Complex reasoning, coding |
| qwen2.5:3b | ~2 GB | 4 GB | Low RAM, quick tasks |
| llava | ~4 GB | 8 GB | Vision / image analysis |
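After pulling, you can check what's installed locally; `ollama list` prints each model with its tag and size, and these are the models LexiChat's picker will show. The fallback message here is just so the command is safe to run before Ollama is installed:

```shell
# List locally available models (these populate LexiChat's model picker)
models=$(ollama list 2>/dev/null || echo "ollama not installed yet")
echo "$models"
```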
## Step 3 — Install LexiChat
Download the latest release for your platform from lexi-chat.com. All platforms and formats are listed there with SHA256 checksums.
**macOS**

- Download LexiChat.dmg from Releases
- Open the DMG and drag LexiChat to Applications
- Right-click → Open on first launch (Gatekeeper)

**Windows**

- Download LexiChat_setup.exe from Releases
- Run the installer and follow the prompts
- LexiChat appears in the Start Menu
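Since the release page publishes SHA256 checksums, you can verify your download before installing. A minimal check, assuming the installer is in your current directory (on macOS, `shasum -a 256 LexiChat.dmg` is the equivalent):

```shell
# Compare this output with the checksum published on lexi-chat.com
# (falls back to a message if the file isn't present)
sum=$(sha256sum LexiChat_setup.exe 2>/dev/null || echo "file not found")
echo "$sum"
```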
## Step 4 — Your First Chat
**Select a model**
In the model picker at the bottom of the chat, choose the model you pulled (e.g. gemma3). LexiChat shows all locally available Ollama models.
**Type a message**
Click the input box at the bottom, type your question, and press Enter or click Send. Responses stream in real time.
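Under the hood, the streaming you see comes from Ollama's standard REST API on port 11434. If you're curious, you can send the same kind of request yourself; this sketch assumes gemma3 is pulled and the server is running, and prints a hint otherwise:

```shell
# A minimal chat request to Ollama's standard /api/chat endpoint;
# with streaming on (the default) the reply arrives as JSON lines.
reply=$(curl -s http://127.0.0.1:11434/api/chat -d '{
  "model": "gemma3",
  "messages": [{"role": "user", "content": "Say hello in five words."}]
}' || true)
out="${reply:-no response (is Ollama running?)}"
echo "$out"
```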
**Try an image**
Click the paperclip icon to attach an image. With a vision-capable model like llava, the AI can describe or analyse it.
**Ask it to search the web**
Try: "What happened in the news today?" — the AI will call the built-in DuckDuckGo web search tool automatically and show you a ⚙ web_search indicator as it works.
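Tool calling like this works by passing function definitions to the model alongside the conversation. The schema LexiChat actually registers isn't shown here; the request below is an illustrative sketch using Ollama's standard `tools` field with a hypothetical web_search definition, and it prints a hint if the server isn't running:

```shell
# Illustrative tool-calling request (the web_search schema here is
# hypothetical, not LexiChat's actual definition)
payload='{
  "model": "gemma3",
  "stream": false,
  "messages": [{"role": "user", "content": "What happened in the news today?"}],
  "tools": [{
    "type": "function",
    "function": {
      "name": "web_search",
      "description": "Search the web with DuckDuckGo",
      "parameters": {
        "type": "object",
        "properties": {"query": {"type": "string"}},
        "required": ["query"]
      }
    }
  }]
}'
resp=$(curl -s http://127.0.0.1:11434/api/chat -d "$payload" || true)
echo "${resp:-no response (is Ollama running?)}"
```

When the model decides to search, its reply contains a tool call rather than text; LexiChat executes the search and feeds the results back, which is what the ⚙ web_search indicator reflects.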