
What is LRU Cache

Key idea:

LRU (Least Recently Used) Cache — an eviction policy that removes the least recently accessed item when the cache overflows. Classic implementation: hash map + doubly linked list for O(1) get/put. In Redis it is enabled with `maxmemory-policy allkeys-lru` (the default is `noeviction`). Alternatives: LFU (Least Frequently Used), FIFO, random. LRU works well for web caching, where recently accessed content tends to stay hot.

Below: details, example, related terms, FAQ.

Details

  • O(1) operations: get + put
  • Structure: HashMap (key → node) + DoublyLinkedList (insertion order)
  • On get: move node to head
  • On put: add at head; if capacity exceeded → remove tail
  • Variants: LFU (frequency-based), TinyLFU and W-TinyLFU (used by the Caffeine Java cache)
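The hash map + doubly linked list structure from the bullets above can be sketched as follows (a minimal illustration, not a production cache; sentinel head/tail nodes are used to simplify the edge cases):

```python
class Node:
    """Doubly linked list node holding one cache entry."""
    __slots__ = ("key", "value", "prev", "next")

    def __init__(self, key=None, value=None):
        self.key, self.value = key, value
        self.prev = self.next = None


class LRUCache:
    """O(1) get/put: dict maps key -> node; the list keeps recency order."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.map = {}
        # Sentinels: head.next is the MRU node, tail.prev is the LRU node.
        self.head, self.tail = Node(), Node()
        self.head.next, self.tail.prev = self.tail, self.head

    def _unlink(self, node):
        node.prev.next, node.next.prev = node.next, node.prev

    def _push_front(self, node):
        node.prev, node.next = self.head, self.head.next
        self.head.next.prev = node
        self.head.next = node

    def get(self, key):
        node = self.map.get(key)
        if node is None:
            return None
        self._unlink(node)          # on get: move node to head
        self._push_front(node)
        return node.value

    def put(self, key, value):
        node = self.map.get(key)
        if node is not None:
            node.value = value
            self._unlink(node)
        else:
            if len(self.map) >= self.capacity:
                lru = self.tail.prev    # capacity exceeded: remove tail
                self._unlink(lru)
                del self.map[lru.key]
            node = Node(key, value)
            self.map[key] = node
        self._push_front(node)          # on put: add at head
```

With capacity 2, putting "a" and "b", touching "a", then putting "c" evicts "b" — the least recently used key.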

Example

# Python: functools.lru_cache decorator
from functools import lru_cache

@lru_cache(maxsize=128)
def expensive(n):
    return n * n

# Redis: cap memory at 1 GB, evict with approximate LRU (redis.conf)
maxmemory 1gb
maxmemory-policy allkeys-lru
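The `lru_cache` decorator also exposes hit/miss statistics via `cache_info()`, which makes the eviction behavior observable. A small demonstration (function name `square` is illustrative):

```python
from functools import lru_cache

@lru_cache(maxsize=2)
def square(n):
    return n * n

square(2)   # miss
square(3)   # miss
square(2)   # hit: result served from cache
info = square.cache_info()
# info.hits == 1, info.misses == 2, info.currsize == 2

square(4)   # cache full: evicts the least recently used key (3)
square(3)   # miss again, since 3 was evicted
```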


Frequently Asked Questions

LRU vs LFU — which?

LRU — when recency matters. LFU — when frequency matters. For web traffic, LRU usually wins, since recent access correlates with future access. LFU suits stable, skewed workloads such as static catalog lookups.

Is Redis LRU really accurate?

No. Redis uses approximate LRU for efficiency: instead of maintaining a full recency list, it samples a few keys (5 by default, tunable via `maxmemory-samples`) and evicts the one with the oldest access time. That is accurate enough for most workloads.
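The sampling idea can be sketched in a few lines (an illustration of the strategy, not Redis internals; it assumes we keep a last-access timestamp per key):

```python
import random

def evict_approx_lru(store, last_access, samples=5):
    """Redis-style approximate LRU: sample a few keys and evict the
    least recently accessed key *among the sample* (not the global LRU)."""
    candidates = random.sample(list(store), min(samples, len(store)))
    victim = min(candidates, key=lambda k: last_access[k])
    del store[victim]
    del last_access[victim]
    return victim
```

When the sample size reaches the number of keys, this degenerates to exact LRU; smaller samples trade accuracy for speed.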

TTL vs LRU?

Different mechanisms. TTL — absolute expiry (delete after N seconds, regardless of usage). LRU — eviction under memory pressure. Often combined: TTL on the data plus LRU for overflow.
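Combining the two can be sketched on top of `collections.OrderedDict` (a minimal illustration; the injectable `clock` parameter is only there to make expiry testable):

```python
import time
from collections import OrderedDict

class TTLLRUCache:
    """TTL + LRU combined: entries expire after `ttl` seconds (absolute
    expiry), and the least recently used entry is evicted on overflow."""

    def __init__(self, capacity, ttl, clock=time.monotonic):
        self.capacity, self.ttl, self.clock = capacity, ttl, clock
        self.data = OrderedDict()   # key -> (value, expires_at); order = recency

    def get(self, key):
        item = self.data.get(key)
        if item is None:
            return None
        value, expires_at = item
        if self.clock() >= expires_at:      # TTL: hard expiry wins
            del self.data[key]
            return None
        self.data.move_to_end(key)          # LRU: mark as recently used
        return value

    def put(self, key, value):
        if key in self.data:
            self.data.move_to_end(key)
        self.data[key] = (value, self.clock() + self.ttl)
        if len(self.data) > self.capacity:  # LRU: evict oldest on overflow
            self.data.popitem(last=False)
```

A live key can still be evicted by LRU pressure, and a recently used key can still die by TTL: the two policies are independent, which is exactly why they compose well.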