Problem Statement
Design and implement a thread-safe LRU (Least Recently Used) cache with O(1) Get and Put operations. This is a classic system design question asked at Google, Meta, and Amazon.
Requirements
- Get(key): Return the value if the key exists, and mark it as recently used
- Put(key, value): Insert or update the entry; evict the least recently used entry if at capacity
- Both operations must be O(1)
- Thread-safe for concurrent access
Data Structure
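The standard layout pairs a hash map (O(1) key lookup) with a doubly linked list ordered by recency (O(1) reordering and eviction at the tail). A minimal sketch of the bookkeeping fields, using sentinel nodes to avoid edge cases at either end; all names here are placeholders:

```python
class Node:
    """Doubly linked list node holding one cache entry."""
    __slots__ = ("key", "value", "prev", "next")

    def __init__(self, key=None, value=None):
        self.key = key
        self.value = value
        self.prev = None
        self.next = None


class LRUState:
    """Hash map for key -> node lookup, plus head/tail sentinels:
    the head side holds the most recently used entry, the tail side
    the least recently used (next eviction candidate)."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.map = {}                  # key -> Node, O(1) lookup
        self.head = Node()             # sentinel: MRU end
        self.tail = Node()             # sentinel: LRU end
        self.head.next = self.tail
        self.tail.prev = self.head
```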
Implementation
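One way to meet all the requirements, sketched below under assumed names (`LRUCache`, `get`, `put`): a dict plus a hand-rolled doubly linked list, with a single `threading.Lock` making each compound read-modify-write atomic. Every operation is a constant number of pointer splices and dict accesses, so both `get` and `put` stay O(1).

```python
import threading


class LRUCache:
    """Thread-safe LRU cache: dict + doubly linked list, O(1) get/put.
    A sketch, not a production implementation."""

    class _Node:
        __slots__ = ("key", "value", "prev", "next")

        def __init__(self, key=None, value=None):
            self.key, self.value = key, value
            self.prev = self.next = None

    def __init__(self, capacity):
        if capacity <= 0:
            raise ValueError("capacity must be positive")
        self._capacity = capacity
        self._map = {}
        self._lock = threading.Lock()
        # Sentinels avoid edge cases when splicing at either end.
        self._head = self._Node()   # MRU side
        self._tail = self._Node()   # LRU side
        self._head.next = self._tail
        self._tail.prev = self._head

    def _unlink(self, node):
        node.prev.next = node.next
        node.next.prev = node.prev

    def _push_front(self, node):
        node.next = self._head.next
        node.prev = self._head
        self._head.next.prev = node
        self._head.next = node

    def get(self, key, default=None):
        with self._lock:
            node = self._map.get(key)
            if node is None:
                return default
            self._unlink(node)          # move to MRU position
            self._push_front(node)
            return node.value

    def put(self, key, value):
        with self._lock:
            node = self._map.get(key)
            if node is not None:        # update existing entry
                node.value = value
                self._unlink(node)
                self._push_front(node)
                return
            if len(self._map) >= self._capacity:
                lru = self._tail.prev   # evict least recently used
                self._unlink(lru)
                del self._map[lru.key]
            node = self._Node(key, value)
            self._map[key] = node
            self._push_front(node)
```

Note that a single lock serializes readers and writers alike; that is the simplest correct design, and the sharding section below on this page is the usual answer when that lock becomes a bottleneck.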
Usage Example
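A usage sketch. To keep the snippet self-contained it uses a compact stand-in built on `collections.OrderedDict`, whose internal doubly linked list gives the same O(1) recency updates (`move_to_end`) and LRU eviction (`popitem(last=False)`); the class and method names are assumptions.

```python
import threading
from collections import OrderedDict


class LRUCache:
    """Compact LRU stand-in: OrderedDict keeps insertion/recency
    order, a lock makes compound operations atomic."""

    def __init__(self, capacity):
        self._capacity = capacity
        self._data = OrderedDict()
        self._lock = threading.Lock()

    def get(self, key, default=None):
        with self._lock:
            if key not in self._data:
                return default
            self._data.move_to_end(key)     # mark as most recently used
            return self._data[key]

    def put(self, key, value):
        with self._lock:
            if key in self._data:
                self._data.move_to_end(key)
            self._data[key] = value
            if len(self._data) > self._capacity:
                self._data.popitem(last=False)  # evict LRU entry


cache = LRUCache(2)
cache.put("a", 1)
cache.put("b", 2)
print(cache.get("a"))   # 1  ("a" is now most recently used)
cache.put("c", 3)       # over capacity: evicts "b", the LRU entry
print(cache.get("b"))   # None
print(cache.get("c"))   # 3
```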
Performance Optimization: Sharded LRU
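A single lock serializes every operation, so under heavy concurrency all threads contend on it. The usual fix is sharding: hash each key to one of N independent LRU segments, each with its own lock, so threads touching different shards never block each other. A sketch under assumed names (`ShardedLRUCache`, `_Shard`); note the global capacity becomes approximate, since each shard evicts independently:

```python
import threading
from collections import OrderedDict


class _Shard:
    """One independently locked LRU segment (OrderedDict-based)."""

    def __init__(self, capacity):
        self._capacity = capacity
        self._data = OrderedDict()
        self._lock = threading.Lock()

    def get(self, key, default=None):
        with self._lock:
            if key not in self._data:
                return default
            self._data.move_to_end(key)         # mark as recently used
            return self._data[key]

    def put(self, key, value):
        with self._lock:
            if key in self._data:
                self._data.move_to_end(key)
            self._data[key] = value
            if len(self._data) > self._capacity:
                self._data.popitem(last=False)  # evict this shard's LRU


class ShardedLRUCache:
    """Routes each key to one shard by hash, cutting lock contention
    roughly by the shard count."""

    def __init__(self, capacity, shards=16):
        # Capacity is split across shards, so the overall limit is
        # approximate rather than exact.
        per_shard = max(1, capacity // shards)
        self._shards = [_Shard(per_shard) for _ in range(shards)]

    def _shard_for(self, key):
        return self._shards[hash(key) % len(self._shards)]

    def get(self, key, default=None):
        return self._shard_for(key).get(key, default)

    def put(self, key, value):
        self._shard_for(key).put(key, value)
```

Shard count is a tuning knob: more shards mean less contention but a looser global capacity bound and a less exact LRU order across the whole cache.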
Follow-up Questions
- How would you implement TTL (time-to-live) for entries?
- How do you handle cache stampede (thundering herd)?
- What's the difference between LRU and LFU?