prokopsafranek/ask
MiniLLM Client

MiniLLM Client is an interactive, browser-based LLM playground with full chat features. Everything runs locally in your browser using localStorage — no backend required.

Features

  • Chat with multiple LLM backends (ChatGPT, LM Studio, Ollama, etc.)
  • Chat history saved in localStorage
  • Rename or delete chats
  • Pop-up windows for uploaded or generated media
  • Markdown rendering for code and formatted text
  • Copy prompts and responses with one click
  • Regenerate responses instantly
  • Lightweight and fast — works in any modern browser
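
The localStorage-backed chat history above (save, rename, delete) can be sketched roughly like this. The `minillm.chats` key, the record shape, and the helper names are assumptions for illustration, not the app's actual code; an in-memory fallback keeps the sketch runnable outside a browser.

```javascript
// Use the browser's localStorage when available; otherwise fall back
// to a Map-based stand-in with the same getItem/setItem/removeItem API.
const store = typeof localStorage !== "undefined"
  ? localStorage
  : (() => {
      const m = new Map();
      return {
        getItem: (k) => (m.has(k) ? m.get(k) : null),
        setItem: (k, v) => m.set(k, String(v)),
        removeItem: (k) => m.delete(k),
      };
    })();

// All chats live under one key as a JSON array (hypothetical key name).
const KEY = "minillm.chats";

function loadChats() {
  return JSON.parse(store.getItem(KEY) ?? "[]");
}

function saveChat(chat) {
  // Insert a new chat or overwrite an existing one by id.
  const chats = loadChats();
  const i = chats.findIndex((c) => c.id === chat.id);
  if (i >= 0) chats[i] = chat;
  else chats.push(chat);
  store.setItem(KEY, JSON.stringify(chats));
}

function renameChat(id, title) {
  const chats = loadChats().map((c) => (c.id === id ? { ...c, title } : c));
  store.setItem(KEY, JSON.stringify(chats));
}

function deleteChat(id) {
  const chats = loadChats().filter((c) => c.id !== id);
  store.setItem(KEY, JSON.stringify(chats));
}
```

Keeping everything under a single key keeps the code simple; a larger app might store one key per chat to avoid rewriting the whole array on every message.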

Demo

Check it out live: https://mini-llm.pages.dev

How to Use

  1. Open the demo in your browser
  2. Type a prompt in the input box
  3. Select a model if desired
  4. Click "Generate" to see the AI's response
  5. Browse your chat history, open media pop-ups, copy responses, or regenerate answers
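
Under the hood, a "Generate" click for a local backend might look like the sketch below. Both LM Studio and Ollama expose an OpenAI-compatible `/v1/chat/completions` endpoint; the URL (Ollama's default port 11434), model name, and function names here are assumptions, not the app's actual code.

```javascript
// Build the request for an OpenAI-compatible chat endpoint.
// Separated from the network call so the payload is easy to inspect/test.
function buildChatRequest(model, history, prompt) {
  return {
    url: "http://localhost:11434/v1/chat/completions", // Ollama's default
    options: {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({
        model,
        // Prior turns plus the new user prompt.
        messages: [...history, { role: "user", content: prompt }],
      }),
    },
  };
}

// Send the request and return the assistant's reply text.
async function generate(model, history, prompt) {
  const { url, options } = buildChatRequest(model, history, prompt);
  const res = await fetch(url, options);
  const data = await res.json();
  return data.choices[0].message.content;
}
```

Pointing `url` at a different base (e.g. LM Studio's default `http://localhost:1234`) is all it takes to switch backends, since the payload shape is the same.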

Why This Project?

MiniLLM Client is perfect for anyone curious about AI. It’s simple, interactive, and fun. You can experiment with multiple models, keep your chats organized, and see responses instantly — all without installing anything.
