Configuration

Profiles, system prompts, file sandboxing, and chat parameter tuning — all in one place.

Opening Settings

Click the Settings gear icon (⚙) in the top-right toolbar to open the Settings dialog. It contains four tabs:

👤 Profiles: manage personas and profile-level config
🔧 General: app-wide settings and the Ollama endpoint
🎛 Defaults: default chat parameter presets
🔌 Tools: manage built-in and custom tools

Profiles

Profiles are saved configurations you can switch between instantly. Each profile has its own:

Setting        Description
name           Display name shown in the profile switcher
model          Default Ollama model preselected for this profile
systemPrompt   The system message sent to the model in every conversation
allowedDirs    Folders the AI can read/write (see File Access below)
chatParams     Default chat parameter preset for new conversations
Managing profiles
- Click New Profile to create a profile from scratch or copy an existing one
- Click a profile name to edit its settings inline
- Switch the active profile from the dropdown in the top toolbar
- Delete profiles you no longer need (the default profile cannot be deleted)
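Putting the settings table above together, a complete profile might look like the following sketch. The on-disk format, field values, and model tag are illustrative assumptions, not LexiChat's actual storage schema:

```json
{
  "name": "Developer",
  "model": "qwen2.5-coder:7b",
  "systemPrompt": "You are an expert software engineer assistant.",
  "allowedDirs": ["~/Documents/LexiChat"],
  "chatParams": { "style": "precise", "length": "auto" }
}
```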

System Prompt

The system prompt is a hidden instruction sent to the model before your first message in every new conversation. Use it to set the AI's role, tone, response language, or any persistent instructions.

Example: Developer profile system prompt
You are an expert software engineer assistant. Reply with code-first answers. Use fenced code blocks with the correct language identifier. Prefer concise, idiomatic solutions. When asked to write a file, always use the write_file tool with a complete absolute path.
Tip: The system prompt can also be overridden per conversation using the Advanced chat parameters panel, which is useful for one-off persona changes without touching your profile.
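In the Ollama chat format, a system prompt like the example above is sent as a leading "system" role message ahead of the conversation history. A minimal sketch (this helper is illustrative, not LexiChat's code):

```python
# Prepend the profile's system prompt as a "system" role message,
# as the Ollama chat API expects. Illustrative sketch only.
def with_system_prompt(system_prompt, history):
    return [{"role": "system", "content": system_prompt}] + history

msgs = with_system_prompt(
    "You are an expert software engineer assistant.",
    [{"role": "user", "content": "Write a quicksort function."}],
)
print([m["role"] for m in msgs])   # ['system', 'user']
```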

File Access (Allowed Directories)

LexiChat restricts file tools to a whitelist of directories. The AI cannot read or write files outside these paths — this is enforced in the Rust backend, not just the prompt.

Adding a directory
  1. Open Settings → select your profile
  2. Under Allowed Directories, click Add folder
  3. A system folder picker opens — navigate to the desired folder and confirm
  4. The path appears in the list; click × to remove it later
Warning: Avoid adding high-level paths like your home directory (~) to the allowed list. Instead, add specific working folders (e.g. ~/Documents/LexiChat) to minimise the blast radius of accidental writes.
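The essence of a whitelist check like this is to canonicalise the requested path before comparing it against the allowed roots, so `..` traversal can't escape the sandbox. A minimal Python sketch of that idea (LexiChat's real enforcement lives in its Rust backend; the function name here is hypothetical):

```python
from pathlib import Path

# Hypothetical sketch of an allowed-directory check.
def is_path_allowed(requested, allowed_dirs):
    """Return True only if `requested` resolves inside an allowed directory."""
    # resolve() canonicalises the path, defusing `..` traversal tricks
    target = Path(requested).resolve()
    for root in allowed_dirs:
        root = Path(root).resolve()
        if target == root or root in target.parents:
            return True
    return False

print(is_path_allowed("/home/me/Documents/LexiChat/notes.md",
                      ["/home/me/Documents/LexiChat"]))   # True
print(is_path_allowed("/home/me/Documents/LexiChat/../.ssh/id_rsa",
                      ["/home/me/Documents/LexiChat"]))   # False
```

Note that a naive string prefix check (`requested.startswith(root)`) would wrongly allow the second path; resolving first is what makes the check safe.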

Chat Parameters (Per-Conversation)

The sliders icon next to the attach button opens a quick-access panel for adjusting the current conversation's behaviour. Changes apply immediately to the next message you send and reset when you start a new chat.

Response Style
  Precise: low temperature (0.2). Best for facts, code, and analysis; minimal creativity or variation.
  Balanced (default): the model's own default temperature. Good for general-purpose conversations.
  Creative: high temperature (1.0). More varied and imaginative; best for brainstorming and writing.

Response Length
  Short: up to 256 tokens
  Medium: up to 1024 tokens
  Long: up to 4096 tokens
  Auto (default): the model decides

Memory (Context Window)
  Standard (default): uses the model's native context window. Recommended; avoids accidentally truncating models with large contexts.
  Extended: forces an 8192-token context. Use this if the model's default is unexpectedly short.
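The presets above can be thought of as shorthand for a handful of raw option values. A hypothetical mapping, using the numbers quoted in the lists (the function itself is not LexiChat code):

```python
# Map the quick-access presets onto Ollama option values.
# "balanced", "auto", and "standard" set nothing, deferring to the model.
STYLE_TEMPERATURE = {"precise": 0.2, "creative": 1.0}
LENGTH_NUM_PREDICT = {"short": 256, "medium": 1024, "long": 4096}

def preset_options(style="balanced", length="auto", memory="standard"):
    opts = {}
    if style in STYLE_TEMPERATURE:
        opts["temperature"] = STYLE_TEMPERATURE[style]
    if length in LENGTH_NUM_PREDICT:
        opts["num_predict"] = LENGTH_NUM_PREDICT[length]
    if memory == "extended":
        opts["num_ctx"] = 8192
    return opts

print(preset_options("precise", "short"))   # {'temperature': 0.2, 'num_predict': 256}
```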

Advanced Parameters

Click "Advanced settings…" at the bottom of the chat params panel to access raw model parameters. These map directly to the Ollama API's options object.

Parameter               Range      What it does
temperature             0–2        Controls randomness. Lower = more deterministic, higher = more creative.
top_p                   0–1        Nucleus sampling. Limits token selection to the top-P probability mass.
top_k                   1–100      Limits selection to the top K most likely tokens at each step.
repeat_penalty          0.5–2      Penalises recently used tokens to reduce repetition.
seed                    integer    Fix a seed for reproducible outputs. Leave blank for random.
num_ctx                 tokens     Overrides the context window size in tokens.
num_predict             tokens     Maximum number of tokens to generate in the response.
system_prompt_override  text       Replaces the profile system prompt for this conversation only.
keep_alive              duration   How long Ollama keeps the model loaded after the last request (e.g. "5m").
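To see how these parameters travel to the model, here is a sketch of an Ollama /api/chat request body. The field names follow Ollama's REST API (most parameters go in the `options` object, while `keep_alive` is a top-level field); how LexiChat assembles the request internally is an assumption:

```python
import json

# Build an Ollama /api/chat request body from advanced parameters.
# Illustrative sketch, not LexiChat's internal code.
def build_chat_request(model, user_msg, system_prompt=None, **options):
    messages = []
    if system_prompt:                    # system_prompt_override lands here
        messages.append({"role": "system", "content": system_prompt})
    messages.append({"role": "user", "content": user_msg})
    body = {"model": model, "messages": messages}
    keep_alive = options.pop("keep_alive", None)
    if keep_alive is not None:           # top-level field, not part of `options`
        body["keep_alive"] = keep_alive
    if options:                          # temperature, top_p, num_ctx, ...
        body["options"] = options
    return body

req = build_chat_request("llama3", "Explain mutexes briefly.",
                         temperature=0.2, num_predict=1024, keep_alive="5m")
print(json.dumps(req, indent=2))
```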

Default Overrides

Go to Settings → Defaults to set the chat parameter defaults that apply when you start a new conversation (or when a profile doesn't specify its own). These are the same controls as the per-conversation panel, but they're saved persistently and used as the starting point for every new chat.

Note: Priority order is per-conversation params (highest) → profile defaults → app defaults (lowest). If a profile doesn't set a default, the app-level default is used.
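The three-level priority order amounts to a layered merge where later layers win. A minimal sketch of that resolution (illustrative only; LexiChat's internal merge logic isn't shown here):

```python
# Resolve effective parameters: per-conversation values override
# profile defaults, which override app-wide defaults.
def resolve_params(app_defaults, profile_defaults, conversation):
    return {**app_defaults, **profile_defaults, **conversation}

params = resolve_params(
    {"temperature": 0.8, "num_predict": 1024},   # app defaults
    {"temperature": 0.2},                        # profile default ("Precise")
    {"num_predict": 256},                        # per-conversation override
)
print(params)   # {'temperature': 0.2, 'num_predict': 256}
```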