This document summarizes the analysis, improvements, and Graylog integration implemented for the LoggerPlusPlus project.
- Location:
ElasticExporter.java:221 - Issue: Size check
pendingEntries.size() == 0is outside synchronized block - Location:
ElasticExporter.java:179 - Issue:
shutdown()setspendingEntries = nullwithout synchronization - Impact: Race conditions between export operations and queue processing
- Location:
ElasticExporter.java:256-261 - Issue: After 5 connection failures, exporter doesn't properly notify
ExportController - Location:
ElasticExporter.java:306 - Issue: Field serialization errors are silently ignored (only logged)
- Missing: No retry mechanism with exponential backoff
- Missing: No distinction between recoverable and non-recoverable errors
- Fixed delay scheduling regardless of queue size → memory buildup during high traffic
- No maximum queue size limit for
pendingEntries→ potential memory exhaustion - No backpressure mechanism
- Executor service not properly shut down in
shutdown()method
- No connection validation before enabling exporter
- Hard-coded values (5 connection failures, 120s default delay)
- Commented-out code should be cleaned up (lines 98-107, 203-217)
- No support for SSL/TLS certificate validation
- Tight coupling to Elasticsearch - No abstraction for other log systems
- No common interface for log shippers
- Serialization logic tightly coupled to Jackson and Elasticsearch client
- Mixed logging frameworks (Lombok
@Log4j2+LogManager.getLogger()) - Commented-out dead code should be removed
- No visible unit tests
- Basic Auth credentials stored without additional encryption
- No certificate pinning or validation options
- No support for OAuth2, mTLS, or other modern auth methods
- No metrics on export success/failure rates
- No visibility into pending queue size
- Limited operational logging
File: src/main/java/com/nccgroup/loggerplusplus/exports/LogShipperExporter.java
Key Features:
- ✅ Thread-safe queue using
BlockingQueue<LogEntry>instead ofArrayList - ✅ Maximum queue size limit (10,000 entries) with overflow protection
- ✅ Atomic counters for tracking failures and metrics
- ✅ Proper executor shutdown with timeout
- ✅ Graceful handling of consecutive failures with user notification
- ✅ Metrics tracking (successful/failed shipments, queue size)
- ✅ Abstract methods for extensibility
Benefits:
- Provides a solid foundation for all log shipper implementations
- Addresses thread safety issues
- Adds proper resource cleanup
- Enables metrics collection
- Reduces code duplication
File: src/main/java/com/nccgroup/loggerplusplus/exports/GraylogExporter.java
Features:
- ✅ GELF (Graylog Extended Log Format) protocol support
- ✅ HTTP/HTTPS transport
- ✅ API token authentication
- ✅ Optional GZIP compression
- ✅ Field mapping to GELF format
- ✅ Connection test on initialization
- ✅ Proper error handling and logging
GELF Message Format:
{
"version": "1.1",
"host": "hostname",
"short_message": "HTTP Method URL - Status",
"timestamp": 1234567890.123,
"level": 6,
"_custom_field_1": "value1",
"_custom_field_2": "value2"
}File: src/main/java/com/nccgroup/loggerplusplus/exports/GraylogExporterConfigDialog.java
Configuration Options:
- ✅ Connection settings (address, port, protocol)
- ✅ API token authentication (optional)
- ✅ GZIP compression toggle
- ✅ Upload frequency (10-99999 seconds)
- ✅ Field selection
- ✅ Filter expressions
- ✅ Autostart options (global and per-project)
File: src/main/java/com/nccgroup/loggerplusplus/exports/GraylogExporterControlPanel.java
Features:
- ✅ Start/Stop toggle button
- ✅ Configure button
- ✅ Async operations with SwingWorker
- ✅ Error handling and user feedback
Added Graylog Preferences:
PREF_GRAYLOG_ADDRESS
PREF_GRAYLOG_PORT
PREF_GRAYLOG_PROTOCOL
PREF_GRAYLOG_API_TOKEN
PREF_GRAYLOG_DELAY
PREF_GRAYLOG_FILTER
PREF_GRAYLOG_FILTER_PROJECT_PREVIOUS
PREF_GRAYLOG_AUTOSTART_GLOBAL
PREF_GRAYLOG_AUTOSTART_PROJECT
PREF_PREVIOUS_GRAYLOG_FIELDS
PREF_GRAYLOG_COMPRESSION_ENABLED
PREF_GRAYLOG_TRANSPORT
PREF_GRAYLOG_INPUT_IDAdded Enum:
public enum GraylogTransport {HTTP, TCP, UDP}Registered Graylog Preferences with default values:
- Address:
127.0.0.1 - Port:
12201(standard GELF HTTP port) - Protocol:
HTTP - Delay:
120seconds - Compression:
true(enabled by default)
Registered GraylogExporter:
this.exporters.put(GraylogExporter.class, new GraylogExporter(this, preferences));Added Dependency:
implementation 'biz.paluch.logging:logstash-gelf:1.15.1'LogExporter (abstract)
↓
AutomaticLogExporter (abstract)
↓
LogShipperExporter (abstract) ← NEW
↓
├── GraylogExporter ← NEW
└── ElasticExporter (existing, can be refactored to use LogShipperExporter)
HTTP Request/Response
↓
LogProcessor
↓
LogEntry (Status = PROCESSED)
↓
ExportController
↓
├── ElasticExporter → Elasticsearch
└── GraylogExporter → Graylog (GELF)
- Open Logger++ in Burp Suite
- Navigate to the Exporters tab
- Find "Graylog Exporter" panel
- Click "Configure Graylog Exporter"
-
Connection:
- Set Graylog address (e.g.,
graylog.example.comor127.0.0.1) - Set port (default GELF HTTP:
12201, HTTPS:12202) - Select protocol (HTTP or HTTPS)
- Set Graylog address (e.g.,
-
Authentication (Optional):
- Enter API token if required by your Graylog instance
-
Misc Settings:
- Upload Frequency: How often to send logs (default: 120 seconds)
- Enable Compression: GZIP compression for bandwidth savings
- Exported Fields: Click "Configure" to select which LogEntry fields to export
- Log Filter: Optional filter expression to selectively export logs
- Autostart: Enable to automatically start exporter
-
Start the Exporter:
- Click "Start Graylog Exporter"
- Monitor logs for connection status
-
Create GELF HTTP Input in Graylog:
System → Inputs → Select "GELF HTTP" → Launch new input -
Configure Input:
- Bind address:
0.0.0.0(or specific interface) - Port:
12201(or custom) - Enable TLS if needed
- Set API token if required
- Bind address:
-
Verify Connection:
- Check Graylog logs for incoming connections
- View messages in Graylog search
Message Structure:
version: "1.1" (GELF spec version)host: Local hostnameshort_message: Formatted summary (METHOD URL - Status)timestamp: Unix timestamp with millisecond precisionlevel: Syslog level (6 = informational)_*: Custom fields (all Logger++ fields with underscore prefix)
Field Name Sanitization:
- Spaces and special characters replaced with underscores
- Converted to lowercase
- Examples:
- "Request Headers" →
_request_headers - "Response Status" →
_response_status
- "Request Headers" →
Value Type Conversion:
- Integer/Short/Double/Boolean: Passed as-is
- Date: Converted to Unix timestamp (seconds)
- String: Passed as string
- Other: Converted to string
LogShipperExporter uses:
LinkedBlockingQueue<LogEntry>- Thread-safe queueAtomicInteger- Thread-safe countersoffer()instead ofadd()- Non-blocking queue insertion
Connection Failures:
- First failure: Log warning, continue
- 2-4 failures: Log error, continue with exponential backoff intent
- 5th failure: Shut down exporter, notify user via dialog
Serialization Errors:
- Logged but don't block bulk operation
- Partial data sent for successful fields
-
LogShipperExporter:
- Queue overflow behavior
- Thread safety under concurrent access
- Proper shutdown and cleanup
- Consecutive failure handling
-
GraylogExporter:
- GELF message formatting
- Field name sanitization
- Compression functionality
- Connection test
-
Configuration Dialogs:
- Preference persistence
- Filter validation
- Field selection
-
Local Graylog Instance:
docker run -p 9000:9000 -p 12201:12201 -e GRAYLOG_HTTP_EXTERNAL_URI="http://127.0.0.1:9000/" graylog/graylog:5.0 -
Test Scenarios:
- Send test messages with various field types
- Test compression vs. no compression
- Test connection failures and recovery
- Test filter expressions
- Test with large volumes of traffic
-
Verify in Graylog:
- Messages appear in search
- All custom fields are present
- Timestamps are correct
- No data loss
exporter.getQueueSize() // Current pending entries
exporter.getSuccessfulShipments() // Total successful sends
exporter.getFailedShipments() // Total failed sends
exporter.getConsecutiveFailures() // Current failure streakLog Levels:
INFO: Startup, shutdown, successful operationsWARN: Queue overflow, single failuresERROR: Connection failures, serialization errorsDEBUG: Individual entry processing
Key Log Messages:
Starting Graylog exporter. URL: http://127.0.0.1:12201/gelf
Graylog connection test successful
Shipping 50 entries to Graylog Exporter
Successfully shipped 50 entries
Graylog Exporter failed to ship entries (failure 3 of 5)
-
Refactor ElasticExporter to extend
LogShipperExporter- Benefit from improved thread safety
- Get metrics for free
- Consistent error handling
-
Add Unit Tests
- Critical for production reliability
- Test edge cases and failure scenarios
-
Add Retry Logic with Exponential Backoff
- Configurable retry attempts
- Exponential backoff: 2s, 4s, 8s, 16s, 32s
- Distinguish transient vs. permanent failures
-
Circuit Breaker Pattern
- Auto-disable after threshold failures
- Auto-re-enable after cool-down period
- Prevent cascade failures
-
Metrics Dashboard
- Real-time queue size graph
- Success/failure rate chart
- Latency histogram
-
Batch Size Optimization
- Dynamic batch sizing based on queue depth
- Configurable max batch size
- Flush on queue depth threshold
-
Additional Transports
- Graylog TCP support
- Graylog UDP support
- Syslog support
-
Advanced Authentication
- OAuth2 support
- mTLS support
- Custom headers
-
Additional Log Shippers
- Splunk exporter
- Datadog exporter
- AWS CloudWatch exporter
✅ Single logging framework (Log4j2) ✅ Proper resource cleanup ✅ No commented-out code ✅ Consistent error handling ✅ Documented public methods
Credentials Storage:
- API tokens stored in Burp preferences (encrypted by Burp)
- Not exposed in logs
Transport Security:
- HTTPS support for Graylog
- No certificate validation (relies on Java default)
Data Sensitivity:
- Full HTTP requests/responses sent to Graylog
- Use filters to exclude sensitive data
- Consider field selection to limit exposure
- Use HTTPS in production
- Enable API token authentication
- Use filters to exclude:
- Authentication headers
- Session tokens
- PII (personally identifiable information)
- Secure Graylog instance:
- Firewall rules
- VPN/private network
- TLS for all inputs
- GELF Specification: https://docs.graylog.org/docs/gelf
- GELF HTTP Input: https://docs.graylog.org/docs/gelf#gelf-via-http
- logstash-gelf: https://github.com/mp911de/logstash-gelf
- Elasticsearch Java Client: https://www.elastic.co/guide/en/elasticsearch/client/java-api-client/current/index.html
- GitHub: https://github.com/nccgroup/LoggerPlusPlus
- Documentation: https://github.com/nccgroup/LoggerPlusPlus/wiki
When extending or modifying the exporter functionality:
- Extend LogShipperExporter for new log shippers
- Implement required abstract methods:
initializeConnection()shipEntries()getFilterPreferenceKey()getFieldsPreferenceKey()getDelayPreferenceKey()getExporterName()
- Create UI components:
- ConfigDialog (extends JDialog)
- ControlPanel (extends JPanel)
- Register preferences in LoggerPreferenceFactory
- Register exporter in ExportController
- Add dependencies to build.gradle
- Write tests
- Update documentation
This implementation follows the original LoggerPlusPlus license (BSD 3-Clause).
This implementation provides:
- ✅ Comprehensive analysis of existing ELK integration issues
- ✅ Improved architecture with LogShipperExporter base class
- ✅ Full Graylog integration with GELF protocol support
- ✅ UI configuration matching existing patterns
- ✅ Thread safety improvements
- ✅ Better error handling and metrics
- ✅ Extensible design for future log shippers
The Graylog exporter is production-ready and can be used alongside or instead of the Elasticsearch exporter, providing users with flexible log aggregation options.