Simulacra is a platform for building LLM-powered Telegram bots with an advanced template-based context system.
This project was created for personal experimentation, but is available for anyone to use.
For Docker specific usage, see the Docker section.
```sh
uv sync
```

If you wish to include development dependencies, add `--dev`.
Modify the example configuration file `example/config.toml` with your `TELEGRAM_API_TOKEN` and `TELEGRAM_USERNAME`.
- Interact with @BotFather to create a new bot and get its API token.
For more information, see the Configuration section.
```sh
uv run app.py example/config.toml
```

Send a message to your bot and it will respond. Bots can also see and understand images, if the model supports this.
Send /help to see a list of commands:
**Actions**

- `/new` - Start a new conversation
- `/retry` - Retry the last response
- `/undo` - Undo the last exchange
- `/clear` - Clear the conversation
- `/continue` - Request another response
- `/instruct (...)` - Apply an instruction
- `/syncbook (...)` - Sync current book position

**Information**

- `/stats` - Show conversation statistics
- `/help` - Show this help message
`cli.py` returns a single response from a character without running the Telegram bot or persisting a conversation:
```sh
uv run cli.py example/context.yml "Hello"
```

The prompt can also be piped via stdin. Use `--set key=value` (repeatable; dotted keys nest) to override context values for the run:
```sh
uv run cli.py example/context.yml "Hello" \
  --set api_params.temperature=0.7
```
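The "dotted keys nest" behavior can be sketched in a few lines of Python. This is an illustration of how a key like `api_params.temperature` maps onto a nested structure, not the project's actual parser:

```python
def apply_override(context: dict, key: str, value: str) -> None:
    """Apply a dotted key like 'api_params.temperature' to a nested dict."""
    *parents, leaf = key.split(".")
    node = context
    for part in parents:
        # Descend into (or create) each intermediate mapping
        node = node.setdefault(part, {})
    node[leaf] = value

context = {"api_params": {"model": "some-model"}}
apply_override(context, "api_params.temperature", "0.7")
# context["api_params"] now holds both the original model and the override
```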
The application is configured by a TOML config file, which initializes one or more Telegram bots and defines the path to their YAML context files.
See `example/config.toml` for a template config file:
```toml
[[simulacra]]
context_filepath = "example/context.yml"
telegram_token = "telegram-bot-token"
authorized_user = "@telegram-username"

[[simulacra]] # Second bot configuration
context_filepath = "example/second_bot_context.yml"
telegram_token = "second-telegram-bot-token"
authorized_user = "@telegram-username"
```

The context file is a YAML file that defines bot configuration and state.
A context file contains the following keys:

| Key | Description |
|---|---|
| `character_name` | The bot's character name |
| `conversation_file` | Relative path to the conversation file (auto-generated) |
| `api_params` | API configuration object |
| `├─ model` | The model to use for the API |
| `└─ <key>` | Additional API parameters (e.g. `temperature`, `max_tokens`) |
| `vars` | Template variables object |
| `├─ system_prompt` | The bot's system prompt |
| `└─ <key>` | Additional template variables |
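Assembled, a minimal context file might look like the following. The character name, model, and prompt are illustrative placeholders, not project defaults:

```yaml
character_name: Ada
conversation_file: conversations/ada.yml
api_params:
  model: anthropic/claude-3.5-sonnet
  temperature: 0.7
vars:
  system_prompt: You are Ada, a helpful assistant.
```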
Conversations are stored separately in a `conversations/` directory. Changes to the context file take effect immediately.
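The "template variables" idea can be sketched briefly. The project's actual templating engine may differ; this only shows how `vars` entries could be substituted into templated text, using Python's standard `string.Template`:

```python
from string import Template

# Hypothetical vars block from a context file
vars = {
    "character_name": "Ada",
    "system_prompt": "You are $character_name, a helpful assistant.",
}

# Substitute any referenced variables into the system prompt
rendered = Template(vars["system_prompt"]).safe_substitute(vars)
print(rendered)  # You are Ada, a helpful assistant.
```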
This project publishes a Docker image to GHCR: `ghcr.io/njbbaer/simulacra`.
Configure your container with the following:
- Mount a directory containing your config and context files to `/config`.
- Set the path to your config file in the environment as `CONFIG_FILEPATH`.
- Set your OpenRouter API key in the environment as `OPENROUTER_API_KEY`.
- Optionally set `TELEGRAM_BOT_API` to a local Telegram Bot API server URL for lower latency and larger file handling.
Ensure the context file paths in your config are accessible within the container (i.e. under `/config`).
```sh
docker run --name simulacra \
  --volume /var/lib/simulacra:/config \
  --env OPENROUTER_API_KEY=your_openrouter_api_key \
  --env CONFIG_FILEPATH=/config/config.toml \
  --env TELEGRAM_BOT_API=http://telegram-bot-api:8081 \
  --restart unless-stopped \
  ghcr.io/njbbaer/simulacra:latest
```

Or with Docker Compose:

```yaml
services:
  simulacra:
    image: ghcr.io/njbbaer/simulacra:latest
    container_name: simulacra
    volumes:
      - /var/lib/simulacra:/config
    environment:
      - OPENROUTER_API_KEY={{ your_openrouter_api_key }}
      - CONFIG_FILEPATH=/config/config.toml
      - TELEGRAM_BOT_API=http://telegram-bot-api:8081
    restart: unless-stopped
```

Enable code reloading with development mode. Create a `.env` file or add the following to your environment:
```sh
export ENVIRONMENT=development
```

Note: Development mode can only run a single bot.
A `docker-compose.yml` is included for running a local Telegram Bot API server. Set `TELEGRAM_API_ID` and `TELEGRAM_API_HASH` in your `.env`, then:
```sh
docker compose up -d
```

Grant your user read access to files created by the container:
```sh
sudo setfacl -R -m u:$(whoami):rX -d -m u:$(whoami):rX /var/lib/telegram-bot-api/
```

Install pre-commit hooks before committing code:
```sh
uv run pre-commit install
```

Lint and test with:

```sh
make lint
make test
```

The release script sets the new version in `pyproject.toml`, commits it, and pushes a tag.
A release is performed by GitHub Actions when the tag is pushed.
```sh
make release type=<major|minor|patch>
```