LRU (Least Recently Used) Cache: an eviction policy that removes the least recently accessed items when the cache overflows. Classic implementation: hash map + doubly linked list for O(1) get/put. In Redis, enable it with `maxmemory-policy allkeys-lru` (the Redis default is `noeviction`). Alternatives: LFU (Least Frequently Used), FIFO, random. LRU generally works well for web caching, where recently accessed content tends to stay hot.
Below: details, example, related terms, FAQ.
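A minimal sketch of the classic hash map + doubly linked list design, here using Python's OrderedDict (which combines both internally); the LRUCache class name and capacity handling are illustrative, not a specific library's API.

# Minimal LRU cache sketch: OrderedDict provides hash-map lookup plus
# doubly-linked-list ordering, keeping get/put at O(1).
from collections import OrderedDict

class LRUCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self.items = OrderedDict()  # key -> value, least recently used first

    def get(self, key):
        if key not in self.items:
            return None
        self.items.move_to_end(key)  # touch: mark as most recently used
        return self.items[key]

    def put(self, key, value):
        if key in self.items:
            self.items.move_to_end(key)
        self.items[key] = value
        if len(self.items) > self.capacity:
            self.items.popitem(last=False)  # evict the least recently used entry

With capacity 2, inserting a third key evicts whichever existing key was touched least recently.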
# Python: functools.lru_cache decorator
from functools import lru_cache

@lru_cache(maxsize=128)  # keeps the 128 most recently used results
def expensive(n):
    return n * n
# Redis: cap memory and evict with LRU across all keys
maxmemory 1gb
maxmemory-policy allkeys-lru

LRU vs LFU? LRU prioritizes recency, LFU prioritizes frequency. For web traffic LRU usually wins, since recent access correlates with future access; LFU is better for static catalog queries.
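If frequency matters more for a workload, Redis (4.0 and later) offers an approximate LFU policy; a sketch of the config, assuming the same 1 GB cap:

# Redis: evict by approximate access frequency instead of recency
maxmemory 1gb
maxmemory-policy allkeys-lfu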
Is Redis LRU exact? No. Redis uses approximate LRU for efficiency: on eviction it samples a handful of keys (5 by default) and evicts the least recently used of the sample. Accurate enough for most use cases.
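The sample size is tunable via `maxmemory-samples`; raising it approximates true LRU more closely at some CPU cost (the value 10 below is illustrative):

# Redis: larger eviction sample -> closer to true LRU, more CPU per eviction
maxmemory-samples 10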
Is LRU the same as TTL? No. TTL is absolute expiry (delete the key after N seconds); LRU is eviction under memory pressure. They are often combined: TTL on the data plus LRU for overflow.
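A sketch of that combination in Redis, where the key name and the 3600-second TTL are illustrative: set a TTL per key, and let `volatile-lru` evict only TTL-bearing keys when memory runs out.

# Per-key TTL: session:42 expires after one hour
SET session:42 "payload" EX 3600

# Config: on memory pressure, evict least recently used keys among those with a TTL
maxmemory 1gb
maxmemory-policy volatile-lru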