MiniLLM Client is an interactive, browser-based LLM playground with full chat features. Everything runs locally in your browser using localStorage — no backend required.
- Chat with multiple LLM backends (ChatGPT, LM Studio, Ollama, etc.)
- Chat history saved in localStorage
- Rename or delete chats
- Pop-up windows for uploaded or generated media
- Markdown rendering for code and formatted text
- Copy prompts and responses with one click
- Regenerate responses instantly
- Lightweight and fast — works in any modern browser
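Since chats live entirely in localStorage, persistence could look roughly like the sketch below. The key name `minillm.chats` and the chat shape are assumptions for illustration, not taken from the actual source; an in-memory fallback is included so the snippet also runs outside a browser.

```javascript
// Minimal sketch of localStorage-backed chat history.
// Key name and chat shape are hypothetical, not from the MiniLLM source.
const store = typeof localStorage !== "undefined"
  ? localStorage
  : (() => {
      // In-memory stand-in for localStorage (e.g. when run under Node).
      const m = new Map();
      return {
        getItem: (k) => (m.has(k) ? m.get(k) : null),
        setItem: (k, v) => m.set(k, String(v)),
      };
    })();

const CHATS_KEY = "minillm.chats"; // assumed storage key

function loadChats() {
  // localStorage only holds strings, so chats are stored as JSON.
  return JSON.parse(store.getItem(CHATS_KEY) ?? "[]");
}

function saveChat(chat) {
  const chats = loadChats();
  chats.push(chat);
  store.setItem(CHATS_KEY, JSON.stringify(chats));
}

function renameChat(id, title) {
  const chats = loadChats().map((c) => (c.id === id ? { ...c, title } : c));
  store.setItem(CHATS_KEY, JSON.stringify(chats));
}

saveChat({ id: 1, title: "First chat", messages: [] });
renameChat(1, "Renamed chat");
console.log(loadChats()[0].title); // "Renamed chat"
```

Deleting a chat would be the same pattern with `filter` instead of `map`.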
Check it out live: <https://mini-llm.pages.dev>
- Open the demo in your browser
- Type a prompt in the input box
- Select a model if desired
- Click "Generate" to see the model's response
- Interact with chat history, media pop-ups, copy responses, or regenerate answers
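For the local backends, the "Generate" step boils down to a single HTTP call. The sketch below shows one plausible way to call Ollama's `/api/generate` endpoint from the browser; the model name is an example, and this is an illustration of the general pattern rather than the client's actual code.

```javascript
// Hypothetical sketch of sending a prompt to a local Ollama server.
// Endpoint and request fields follow Ollama's /api/generate API;
// the model name "llama3" is just an example.
function buildGenerateRequest(model, prompt) {
  return {
    url: "http://localhost:11434/api/generate",
    options: {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ model, prompt, stream: false }),
    },
  };
}

async function generate(model, prompt) {
  const { url, options } = buildGenerateRequest(model, prompt);
  const res = await fetch(url, options);
  const data = await res.json();
  // With stream: false, Ollama returns the full completion in `response`.
  return data.response;
}
```

`generate("llama3", "Hello!")` assumes Ollama is running locally on its default port; regenerating an answer is just re-sending the same prompt.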
MiniLLM Client is perfect for anyone curious about AI. It’s simple, interactive, and fun. You can experiment with multiple models, keep your chats organized, and see responses instantly — all without installing anything.