Chat with AI. No accounts. No cloud. Your data stays on your device.
Works in your browser. One click. That's it.
Codex Agent Runner is a beautiful chat interface that lets you talk to AI — completely privately, right on your own computer. No API keys, no subscriptions, no data sent to the cloud.
Just you and your AI, having a conversation.
👉 Download Ollama at ollama.ai
Ollama is a free tool that runs AI models on your computer. It works on Mac, Windows, and Linux.
After installing Ollama, open your Terminal (Mac/Linux) or Command Prompt (Windows) and run:
```shell
ollama pull llama3
```
That's the only command you'll ever need.
The page will automatically detect your local Ollama and you're ready to chat!
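The automatic detection could be sketched roughly like this. It assumes Ollama's default local endpoint (`http://localhost:11434`) and its `GET /api/tags` route, which lists installed models; `detectOllama` is an illustrative name, not necessarily what the page calls internally:

```javascript
// Sketch: probe the local Ollama server and list installed models.
// Assumes Ollama's default port (11434) and the /api/tags endpoint.
async function detectOllama(fetchImpl = fetch) {
  try {
    const res = await fetchImpl('http://localhost:11434/api/tags');
    if (!res.ok) return { online: false, models: [] };
    const data = await res.json();
    // /api/tags returns { models: [{ name: 'llama3:latest', ... }, ...] }
    return { online: true, models: (data.models ?? []).map((m) => m.name) };
  } catch {
    // Connection refused means Ollama isn't running.
    return { online: false, models: [] };
  }
}
```

If the probe fails, the page can show an "offline" badge and point you at the install steps above.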
Just type naturally! You can also address specific AI personas:
| Type this… | What happens |
|---|---|
| `Hello, how are you?` | Chat directly with the AI |
| `@ollama explain black holes` | Talk to Ollama |
| `@copilot write me a Python function` | Talk to the Copilot persona |
| `@lucidia tell me a story` | Talk to the Lucidia persona |
| `@blackboxprogramming` | Talk to the BlackRoad AI |
All of these talk to your local Ollama — nothing is sent to any external server.
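The `@handle` routing above could be implemented along these lines. This is a hypothetical sketch (`splitHandle` is an illustrative name; the real parsing lives in `ollama.js`):

```javascript
// Sketch: split a message into an optional @handle and the remaining prompt.
// A message with no leading @handle goes straight to the default chat.
function splitHandle(message) {
  const match = message.match(/^@(\w+)\s*([\s\S]*)$/);
  if (!match) return { handle: null, prompt: message };
  return { handle: match[1].toLowerCase(), prompt: match[2] };
}
```

For example, `splitHandle('@lucidia tell me a story')` yields the handle `lucidia` and the prompt `tell me a story`, while a plain greeting passes through with no handle.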
- ✅ Fully offline — your conversations never leave your machine
- ✅ No account needed — zero sign-up, zero tracking
- ✅ Free forever — no subscriptions or API costs
- ✅ Open source — see exactly what runs on your machine
Want to use the API in your own project?
```javascript
import { ollamaChat, parseHandle } from './ollama.js';

const { handle, prompt } = parseHandle('@lucidia explain quantum entanglement');

await ollamaChat({
  model: 'llama3',
  messages: [{ role: 'user', content: prompt }],
  onChunk: (text) => process.stdout.write(text),
  onDone: () => console.log('\n'),
  onError: (err) => console.error(err.message),
});
```

See `ollama.js` for the full API.
- Ollama shows "offline"? Make sure Ollama is running: open the Ollama app or run `ollama serve` in your terminal.
- No models available? Run `ollama pull llama3` in your terminal.
- Still stuck? Open an issue and we'll help!