This example demonstrates the differences between LFU (Least Frequently Used) and LRU (Least Recently Used) cache eviction strategies through a side-by-side comparison. It covers:
- LFU cache behavior with frequency-based eviction
- LRU cache behavior with recency-based eviction
- Performance comparison between the two strategies
- Use case recommendations for each strategy
- Cache statistics and metrics comparison
LFU (Least Frequently Used):

- Eviction Strategy: Removes items accessed least frequently
- Best For: Workloads with varying access patterns
- Implementation: TinyLFU algorithm via Ristretto
- Advantage: Keeps "hot" items in cache longer
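To make the frequency-based rule concrete, here is a minimal, illustrative sketch of LFU eviction. The `lfuCache` type and its methods are hypothetical names for this example; real implementations such as Ristretto's TinyLFU use approximate frequency counters, but the eviction rule shown (drop the key with the lowest access count) is the same idea.

```go
package main

import "fmt"

// lfuCache is a minimal LFU cache sketch: on insertion past capacity
// it evicts the key with the lowest access count.
type lfuCache struct {
	capacity int
	values   map[string]string
	counts   map[string]int
}

func newLFU(capacity int) *lfuCache {
	return &lfuCache{capacity: capacity, values: map[string]string{}, counts: map[string]int{}}
}

func (c *lfuCache) Get(key string) (string, bool) {
	v, ok := c.values[key]
	if ok {
		c.counts[key]++ // every hit raises the key's frequency
	}
	return v, ok
}

func (c *lfuCache) Set(key, value string) {
	if _, exists := c.values[key]; !exists && len(c.values) >= c.capacity {
		// Evict the least frequently used key.
		victim, min := "", int(^uint(0)>>1)
		for k, n := range c.counts {
			if n < min {
				victim, min = k, n
			}
		}
		delete(c.values, victim)
		delete(c.counts, victim)
	}
	c.values[key] = value
	c.counts[key]++
}

func main() {
	c := newLFU(2)
	c.Set("user1", "Alice")
	c.Get("user1")          // user1 now has frequency 2
	c.Set("user2", "Bob")   // user2 has frequency 1
	c.Set("user3", "Carol") // cache full: user2 (lowest count) is evicted
	_, ok := c.Get("user2")
	fmt.Println("user2 still cached:", ok) // false: least-frequent key was evicted
}
```

Note the trade-off visible even in this sketch: the eviction scan and the per-key counters are extra work and memory that a pure LRU does not pay.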
LRU (Least Recently Used):

- Eviction Strategy: Removes items not accessed recently
- Best For: Sequential or temporal access patterns
- Implementation: Doubly-linked list via hashicorp/golang-lru
- Advantage: Simple and predictable behavior
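The doubly-linked-list design can be sketched in a few lines with the standard library's `container/list`. This is a stripped-down illustration of the recency-based rule, not hashicorp/golang-lru's actual code; the `lruCache` type and helper names are invented for this example.

```go
package main

import (
	"container/list"
	"fmt"
)

// lruCache keeps keys in a list ordered by recency (front = most
// recent) and evicts from the tail (least recently used).
type lruCache struct {
	capacity int
	order    *list.List               // recency order
	items    map[string]*list.Element // key -> list node holding the key
	values   map[string]string
}

func newLRU(capacity int) *lruCache {
	return &lruCache{capacity: capacity, order: list.New(), items: map[string]*list.Element{}, values: map[string]string{}}
}

func (c *lruCache) Get(key string) (string, bool) {
	v, ok := c.values[key]
	if ok {
		c.order.MoveToFront(c.items[key]) // touch: mark as most recent
	}
	return v, ok
}

func (c *lruCache) Set(key, value string) {
	if el, ok := c.items[key]; ok {
		c.order.MoveToFront(el)
		c.values[key] = value
		return
	}
	if len(c.values) >= c.capacity {
		// Evict the least recently used key (list tail).
		tail := c.order.Back()
		old := tail.Value.(string)
		c.order.Remove(tail)
		delete(c.items, old)
		delete(c.values, old)
	}
	c.items[key] = c.order.PushFront(key)
	c.values[key] = value
}

func main() {
	c := newLRU(2)
	c.Set("user1", "Alice")
	c.Set("user2", "Bob")
	c.Get("user1")          // user1 is now the most recently used
	c.Set("user3", "Carol") // evicts user2, the least recently used
	_, ok := c.Get("user2")
	fmt.Println("user2 still cached:", ok) // false
}
```

Every operation here is O(1), which is why the LRU variant is described as simpler and cheaper than frequency tracking.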
Prerequisites:

- Go 1.25 or later
- Redis server running on `localhost:6379`
Start a Redis server:

```bash
# Using Redis directly
redis-server

# Or using Docker
docker run -d -p 6379:6379 redis:latest
```

Run the example:

```bash
cd examples/comparison
go run main.go
```

The example demonstrates both cache types with the same access patterns:
```text
--- LFU Cache Demonstration ---
Adding users with different access frequencies...
User 1: 10 accesses (high frequency)
User 2: 5 accesses (medium frequency)
User 3: 2 accesses (low frequency)

LFU Behavior:
- User 1 (10 accesses) will be kept longest
- User 2 (5 accesses) will be kept moderately
- User 3 (2 accesses) will be evicted first
- Frequency tracking ensures hot items stay cached

Cache Statistics:
Local Hits: 17
Local Misses: 3
Hit Ratio: 85.00%

--- LRU Cache Demonstration ---
Adding users in sequence...
Order: User1 → User2 → User3

Accessing in specific order:
User3 → User1 → User2

LRU Behavior:
- User2 (most recent) will be kept longest
- User1 (moderately recent) will be kept moderately
- User3 (least recent) will be evicted first
- Recency determines eviction priority

Cache Statistics:
Local Hits: 15
Local Misses: 3
Hit Ratio: 83.33%
```
The example provides a detailed comparison:
```text
========================================
LFU vs LRU Comparison Summary
========================================
LFU (Least Frequently Used):
✓ Tracks access frequency
✓ Better for varying access patterns
✓ Keeps frequently accessed items
✓ More complex implementation
✓ Higher memory overhead

LRU (Least Recently Used):
✓ Tracks access recency
✓ Better for sequential patterns
✓ Keeps recently accessed items
✓ Simpler implementation
✓ Lower memory overhead
```
When to Use LFU:
- Access patterns vary significantly
- Some items are much more popular
- You want to maximize hit ratio
- Memory is limited
When to Use LRU:
- Sequential access patterns
- Temporal locality is important
- Predictable behavior needed
- Simplicity is preferred
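The divergence between the two policies shows up even on a tiny access trace. This illustrative simulation (not the library's code; the helper names are invented) replays one pattern against both eviction rules and gets different victims:

```go
package main

import "fmt"

// lfuVictim returns the key with the lowest access count (LFU rule).
func lfuVictim(freq map[string]int) string {
	victim, min := "", int(^uint(0)>>1)
	for k, n := range freq {
		if n < min {
			victim, min = k, n
		}
	}
	return victim
}

// lruVictim returns the key touched longest ago (LRU rule); the
// slice is ordered oldest access first.
func lruVictim(oldestFirst []string) string {
	return oldestFirst[0]
}

func main() {
	// Trace: user1 accessed three times, then user2 once. A new key
	// now forces an eviction, and the policies disagree on the victim.
	freq := map[string]int{"user1": 3, "user2": 1}
	recency := []string{"user1", "user2"} // user1's last access is older
	fmt.Println("LFU evicts:", lfuVictim(freq))    // user2: lowest frequency
	fmt.Println("LRU evicts:", lruVictim(recency)) // user1: least recently used
}
```

LFU protects the hot key regardless of when it was last touched; LRU protects whatever was touched last, regardless of how popular it has been overall.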
LFU cache configuration:

```go
cfg := dc.DefaultConfig()
cfg.LocalCacheConfig = dc.LocalCacheConfig{
	NumCounters:        1e7,     // 10M counters for frequency tracking
	MaxCost:            1 << 30, // 1GB max cache size
	BufferItems:        64,      // Buffer for async operations
	IgnoreInternalCost: false,   // Track actual memory cost
}
cfg.LocalCacheFactory = cache.NewLFUCacheFactory(cfg.LocalCacheConfig)
```

LRU cache configuration:

```go
cfg := dc.DefaultConfig()
maxSize := 10000 // Maximum number of items
cfg.LocalCacheFactory = cache.NewLRUCacheFactory(maxSize)
```

LFU performance characteristics:

- Memory: Higher (frequency counters + cache data)
- CPU: Moderate (frequency tracking overhead)
- Hit Ratio: Excellent for varying patterns
- Eviction: Smart (frequency-based)

LRU performance characteristics:

- Memory: Lower (just cache data + linked list)
- CPU: Low (simple recency tracking)
- Hit Ratio: Good for sequential patterns
- Eviction: Predictable (recency-based)
LFU use cases:

- E-commerce Product Catalog
  - Popular products accessed frequently
  - Long-tail products accessed rarely
  - Want to keep bestsellers in cache
- Content Delivery
  - Viral content gets many hits
  - Old content rarely accessed
  - Frequency matters more than recency
- API Rate Limiting
  - Track frequent API callers
  - Evict infrequent users first
  - Optimize for heavy users

LRU use cases:

- Session Management
  - Recent sessions more likely to be active
  - Old sessions can be evicted
  - Temporal locality is key
- Log Processing
  - Recent logs accessed more often
  - Sequential processing pattern
  - Recency matters most
- Database Query Cache
  - Recent queries likely to repeat
  - Simple eviction policy needed
  - Predictable behavior preferred
This example teaches:

- How LFU and LRU differ in eviction behavior
- When to use each strategy based on access patterns
- How to configure each cache type with appropriate parameters
- Performance trade-offs between the two approaches
- How to measure cache effectiveness using statistics
After understanding the comparison, explore:
- LFU Example - Deep dive into LFU cache
- LRU Example - Deep dive into LRU cache
- Custom Local Cache - Implement your own eviction strategy
- Basic Example - Learn basic cache operations
- Main README - Complete library documentation
- Getting Started Guide - Step-by-step setup
- Performance Characteristics - Detailed performance info