A powerful log monitoring and analysis application that collects logs from Linux servers (via rsyslog) and Docker containers, analyzes them using local AI (Ollama), and sends intelligent alerts via Telegram.
Practical and easy to deploy and operate.
- Features
- Architecture
- Quick Start
- Docker Compose
- Manual Installation
- Configuration
- Usage
- Troubleshooting
- Contributing
- License
| Dark Theme | Lite Theme |
|---|---|
| ![Dark theme screenshot]() | ![Lite theme screenshot]() |
| AI Analyzer | AI Analyzer |
- 📊 Dashboard - Real-time overview of log statistics and system health
- 📝 Log Collection - Collect logs from rsyslog (UDP/TCP) and Docker containers
- 🤖 AI Analysis - Analyze logs using local Ollama AI for intelligent insights
- 🔔 Smart Alerts - Create filters to detect specific patterns and receive Telegram notifications
- 🐳 Docker Integration - Auto-discover and monitor Docker container logs
- 💬 AI Chat - Interactive chat assistant for log troubleshooting
- 🎨 Modern UI - Clean, responsive interface inspired by oVirt/Foreman
┌─────────────────┐ ┌─────────────────┐
│ Linux Servers │────▶│ Syslog UDP │
│ (rsyslog) │ │ Port 5514 │
└─────────────────┘ └────────┬────────┘
│
┌─────────────────┐ ┌────────▼────────┐ ┌─────────────────┐
│ Docker │────▶│ LogAI │────▶│ Redis │
│ Containers │ │ Monitor │ │ (Storage) │
└─────────────────┘ └────────┬────────┘ └─────────────────┘
│
┌────────▼────────┐ ┌─────────────────┐
│ Ollama AI │────▶│ Telegram │
│ (Analysis) │ │ (Alerts) │
└─────────────────┘ └─────────────────┘
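Once the stack is running, the ingestion path above can be exercised with a short UDP sender. A minimal sketch in Python; the host/port default to a local listener on 5514, and the hostname/tag values are illustrative:

```python
import socket
from datetime import datetime

def build_syslog_message(message: str, hostname: str = "testhost",
                         facility: int = 1, severity: int = 6) -> bytes:
    """Format a message as an RFC 3164-style syslog packet: <PRI>timestamp host tag: msg."""
    pri = facility * 8 + severity  # e.g. user.info -> <14>
    timestamp = datetime.now().strftime("%b %d %H:%M:%S")
    return f"<{pri}>{timestamp} {hostname} test: {message}".encode()

def send_test_log(host: str = "127.0.0.1", port: int = 5514) -> None:
    """Fire-and-forget UDP datagram, comparable to `logger -n <host> -P 5514 -d`."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(build_syslog_message("Test message from server"), (host, port))
```

If the message shows up in the Dashboard a few seconds later, the UDP path is working end to end.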
- Clone the repository:
```bash
git clone https://github.com/yourusername/logaimonitor.git
cd logaimonitor
```
- Copy the example Compose file and edit it (or use the web UI later to change settings):
```bash
cp docker-compose.example.yml docker-compose.yml
```
  Edit `docker-compose.yml` to set the required environment variables (e.g. `SECRET_KEY`, `OLLAMA_HOST`, `TELEGRAM_BOT_TOKEN`, `TELEGRAM_CHAT_ID`), or leave the defaults and change them later via the web UI in Settings.
- Start the application:
```bash
# Using Docker Compose v2
docker compose up -d

# Or with the older standalone docker-compose:
docker-compose up -d
```
- Access the web interface at http://localhost:5059 (default credentials: `admin` / `admin`)
- Install dependencies:
```bash
pip install -r requirements.txt
```
- Start Redis:
```bash
docker run -d --name redis -p 6379:6379 redis:7-alpine
```
- Install and start Ollama:
```bash
# Install Ollama
curl -fsSL https://ollama.ai/install.sh | sh

# Pull a model
ollama pull llama3.2

# Start the Ollama server
ollama serve
```
- Run the application:
```bash
python app.py
```

| Variable | Description | Default |
|---|---|---|
| `SECRET_KEY` | Flask secret key | `change-this` |
| `REDIS_HOST` | Redis hostname | `localhost` |
| `REDIS_PORT` | Redis port | `6379` |
| `OLLAMA_HOST` | Ollama API URL | `http://localhost:11434` |
| `OLLAMA_MODEL` | Ollama model name | `llama3.2` |
| `TELEGRAM_BOT_TOKEN` | Telegram bot token | - |
| `TELEGRAM_CHAT_ID` | Telegram chat ID | - |
| `LOG_RETENTION_HOURS` | Log retention period (hours) | `12` |
| `ANALYSIS_INTERVAL_SECONDS` | Auto-analysis interval (seconds) | `300` |
On your Linux servers, add this configuration to `/etc/rsyslog.d/99-logaimonitor.conf`:
```
# Forward all logs via UDP
*.* @logaimonitor-host:5514

# Or via TCP (more reliable)
*.* @@logaimonitor-host:5515
```
Then restart rsyslog:
```bash
sudo systemctl restart rsyslog
```
If you only want to forward certain log types:
```
# Only auth/security logs
auth,authpriv.* @logaimonitor-host:5514

# Only errors and above
*.err @logaimonitor-host:5514

# Kernel messages
kern.* @logaimonitor-host:5514
```
Send a test log immediately:
```bash
logger -n logaimonitor-host -P 5514 -d "Test message from server"
```

For Docker containers running on external/remote hosts, you have several options:
**Option 1: Syslog log driver.** On the remote Docker host, configure containers to send logs via syslog:
```bash
# Run containers with the syslog driver
docker run -d \
  --log-driver=syslog \
  --log-opt syslog-address=udp://logaimonitor-host:5514 \
  --log-opt tag="{{.Name}}" \
  your-image
```
Or set it as the default for all containers in `/etc/docker/daemon.json`:
```json
{
  "log-driver": "syslog",
  "log-opts": {
    "syslog-address": "udp://logaimonitor-host:5514",
    "tag": "{{.Name}}"
  }
}
```
Then restart Docker:
```bash
sudo systemctl restart docker
```

**Option 2: Remote Docker API.** On the remote host, edit `/etc/docker/daemon.json` to expose the Docker socket over TCP:
```json
{
  "hosts": ["unix:///var/run/docker.sock", "tcp://0.0.0.0:2375"]
}
```
Then, on LogAI Monitor, set the environment variable:
```bash
DOCKER_SOCKET=tcp://remote-host:2375
```
Note that port 2375 is unauthenticated and unencrypted; restrict it to a trusted network.

**Option 3: journald forwarding.** Install rsyslog on the remote Docker host and configure journald forwarding:
```
# /etc/rsyslog.d/99-docker-forward.conf
module(load="imjournal")
:programname, startswith, "docker" @logaimonitor-host:5514
```

Recommendation: Option 1 (syslog driver) is the easiest and most secure: no extra configuration is needed on LogAI Monitor, and logs appear as ordinary syslog entries.
- Create a bot with @BotFather on Telegram
- Copy the bot token
- Send a message to your bot
- Get your chat ID from `https://api.telegram.org/bot<TOKEN>/getUpdates`
- Configure in Settings or via environment variables
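The token and chat ID can be verified outside the app with the Bot API's `sendMessage` method. A standard-library sketch; the token and chat ID values are whatever @BotFather and `getUpdates` gave you:

```python
import json
import urllib.request

def build_send_message(token: str, chat_id: str, text: str) -> urllib.request.Request:
    """Build a Telegram Bot API sendMessage request."""
    url = f"https://api.telegram.org/bot{token}/sendMessage"
    payload = json.dumps({"chat_id": chat_id, "text": text}).encode()
    return urllib.request.Request(url, data=payload,
                                  headers={"Content-Type": "application/json"})

def send_alert(token: str, chat_id: str, text: str) -> dict:
    """POST the message and return the decoded API response (ok/result fields)."""
    with urllib.request.urlopen(build_send_message(token, chat_id, text)) as resp:
        return json.load(resp)
```

If `send_alert(...)` returns `{"ok": true, ...}`, the same credentials will work in LogAI Monitor's Settings.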
Filters allow you to monitor specific log patterns:
- Go to Filters in the sidebar
- Click Create Filter
- Configure conditions:
- Severity: Match specific severity levels
- Source Contains: Match logs from specific sources
- Message Contains: Match logs containing specific text
- Message Regex: Advanced pattern matching
- Enable Telegram notification if desired
- Save the filter
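To make the four condition types concrete, here is a hypothetical sketch of how a filter's conditions could be evaluated against a log entry (field names and AND semantics are assumptions for illustration, not the app's actual implementation):

```python
import re

def filter_matches(log: dict, flt: dict) -> bool:
    """Return True if `log` satisfies every condition configured on `flt` (AND semantics assumed)."""
    if flt.get("severity") and log.get("severity") != flt["severity"]:
        return False
    if flt.get("source_contains") and flt["source_contains"] not in log.get("source", ""):
        return False
    if flt.get("message_contains") and flt["message_contains"] not in log.get("message", ""):
        return False
    if flt.get("message_regex") and not re.search(flt["message_regex"], log.get("message", "")):
        return False
    return True
```

For example, a filter with severity `error` and message-contains `disk full` would match an error from any source whose text mentions a full disk, and could then fan out to a Telegram notification.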
- Go to AI Analysis in the sidebar
- Click Analyze Recent Logs for batch analysis
- Use the Chat Assistant to ask questions about your logs
- Click on any log entry and use Analyze with AI for detailed analysis
- Go to Docker Containers in the sidebar
- View all running containers
- Click Logs to view container logs
- Logs are automatically collected and analyzed
Logs:
- `GET /api/logs` - Get logs with filtering
- `GET /api/logs/<id>` - Get a single log
- `POST /api/logs/ingest` - Ingest a log via HTTP

Filters:
- `GET /api/filters` - List all filters
- `POST /api/filters` - Create a filter
- `PUT /api/filters/<id>` - Update a filter
- `DELETE /api/filters/<id>` - Delete a filter

Alerts:
- `GET /api/alerts` - List alerts
- `POST /api/alerts/<id>/acknowledge` - Acknowledge an alert

AI:
- `GET /api/ollama/status` - Check Ollama status
- `POST /api/ollama/analyze` - Analyze logs
- `POST /api/ollama/chat` - Chat with the AI

Settings:
- `GET /api/settings` - Get settings
- `POST /api/settings` - Save settings
- `POST /api/telegram/test` - Test the Telegram connection
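The ingest endpoint makes it easy to push logs from scripts. A standard-library sketch; the base URL matches the default web port, but the payload field names (`message`, `source`, `severity`) are assumptions, so check the running instance's API for the exact schema:

```python
import json
import urllib.request

BASE_URL = "http://localhost:5059"  # adjust to your deployment

def build_ingest_request(message: str, source: str = "script",
                         severity: str = "info") -> urllib.request.Request:
    """Build a POST to /api/logs/ingest with a JSON body (field names illustrative)."""
    payload = json.dumps({"message": message, "source": source,
                          "severity": severity}).encode()
    return urllib.request.Request(f"{BASE_URL}/api/logs/ingest", data=payload,
                                  headers={"Content-Type": "application/json"})

def ingest_log(message: str, **fields) -> dict:
    """Send one log entry and return the decoded response."""
    with urllib.request.urlopen(build_ingest_request(message, **fields)) as resp:
        return json.load(resp)
```

This is useful for application events that never pass through syslog, such as deploy scripts or cron jobs.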
| Port | Protocol | Description |
|---|---|---|
| 5059 | TCP | Web interface |
| 5514 | UDP | Syslog (UDP) |
| 5515 | TCP | Syslog (TCP) |
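A quick way to confirm the TCP listeners above are reachable from a source host is a connect check (UDP 5514 cannot be probed this way, since datagrams are fire-and-forget). A minimal sketch:

```python
import socket

def port_open(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within the timeout."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
        sock.settimeout(timeout)
        return sock.connect_ex((host, port)) == 0

# Example: probe the web UI and the TCP syslog listener
# for port in (5059, 5515):
#     print(port, "open" if port_open("logaimonitor-host", port) else "closed")
```

If a port reports closed from a remote host but open locally, suspect firewall rules or Docker port mappings.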
- Backend: Python, Flask, Flask-SocketIO
- Storage: Redis
- AI: Ollama (local LLM)
- Notifications: Telegram Bot API
- Frontend: HTML, CSS, JavaScript
- Deployment: Docker, Docker Compose
- Check rsyslog configuration on source servers
- Verify network connectivity (ports 5514/5515)
- Check firewall rules
- View LogAI Monitor logs: `docker-compose logs -f logaimonitor`
- Verify Ollama is running: `curl http://localhost:11434/api/tags`
- Check the model is pulled: `ollama list`
- Verify the `OLLAMA_HOST` environment variable
- Verify bot token is correct
- Check the chat ID (you must start a conversation with the bot first)
- Use "Test Connection" in Settings
Contributions are welcome! Please read our contributing guidelines and submit pull requests.
GPL-3.0 License - see LICENSE file for details.
Copyright (C) 2026 Fotios Tsiadimos
- Ollama - Local AI inference
- Flask - Web framework
- Redis - In-memory data store
- Font Awesome - Icons

