Design a data structure for a Least Recently Used (LRU) cache that efficiently supports 'get' and 'put' operations

Approach

To design a data structure for a Least Recently Used (LRU) cache that efficiently supports get and put operations, follow this structured framework:

  1. Understand the LRU cache requirements:

  • The cache stores a limited number of items.

  • When the cache reaches its capacity, it evicts the least recently used item.

  2. Choose appropriate data structures:

  • A hash map provides O(1) access to cache entries.

  • A doubly linked list maintains usage order, allowing O(1) updates when items are accessed or added.

  3. Implement the operations:

  • get retrieves an item and moves it to the front of the usage list.

  • put inserts or updates an item, evicting the least recently used item if the cache is full.

Key Points

  • Efficiency: Both get and put operations should run in O(1) time complexity.

  • Storage: The cache should have a fixed capacity that can be set upon initialization.

  • Order Maintenance: The order of items in the cache should be updated to reflect usage, with the most recently accessed items at the front.

Standard Response

Here's a sample implementation of an LRU Cache in Python using a hash map and a doubly linked list:

class Node:
    def __init__(self, key=0, value=0):
        self.key = key
        self.value = value
        self.prev = None
        self.next = None

class LRUCache:
    def __init__(self, capacity: int):
        self.capacity = capacity
        self.cache = {}  # key -> Node
        self.head = Node()  # Dummy head
        self.tail = Node()  # Dummy tail
        self.head.next = self.tail
        self.tail.prev = self.head

    def _remove(self, node: Node):
        """Remove a node from the linked list."""
        prev_node = node.prev
        next_node = node.next
        prev_node.next = next_node
        next_node.prev = prev_node

    def _add_to_front(self, node: Node):
        """Add a node right after the head."""
        node.prev = self.head
        node.next = self.head.next
        self.head.next.prev = node
        self.head.next = node

    def get(self, key: int) -> int:
        """Return the value of the key if the key exists, otherwise return -1."""
        if key in self.cache:
            node = self.cache[key]
            self._remove(node)
            self._add_to_front(node)
            return node.value
        return -1

    def put(self, key: int, value: int) -> None:
        """Update the value of the key if it exists; otherwise add the key-value pair."""
        if key in self.cache:
            node = self.cache[key]
            self._remove(node)
            node.value = value
            self._add_to_front(node)
        else:
            if len(self.cache) == self.capacity:
                # Evict the least recently used item (the node just before the dummy tail)
                lru_node = self.tail.prev
                self._remove(lru_node)
                del self.cache[lru_node.key]
            new_node = Node(key, value)
            self.cache[key] = new_node
            self._add_to_front(new_node)
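For comparison, the same behavior can be expressed far more compactly with Python's collections.OrderedDict, which keeps insertion order and supports O(1) reordering and eviction. This is a sketch, not part of the original solution; the class name LRUCacheOD is illustrative:

```python
from collections import OrderedDict

class LRUCacheOD:
    """Same interface as LRUCache, backed by an OrderedDict (oldest first, newest last)."""
    def __init__(self, capacity: int):
        self.capacity = capacity
        self.cache = OrderedDict()

    def get(self, key: int) -> int:
        if key not in self.cache:
            return -1
        self.cache.move_to_end(key)  # mark as most recently used
        return self.cache[key]

    def put(self, key: int, value: int) -> None:
        if key in self.cache:
            self.cache.move_to_end(key)
        self.cache[key] = value
        if len(self.cache) > self.capacity:
            self.cache.popitem(last=False)  # evict least recently used

# Quick usage check
cache = LRUCacheOD(2)
cache.put(1, 1)
cache.put(2, 2)
assert cache.get(1) == 1   # key 1 becomes most recently used
cache.put(3, 3)            # capacity exceeded: evicts key 2
assert cache.get(2) == -1
```

In an interview, the compact version is useful as a sanity check against the hand-rolled linked-list implementation, even if the interviewer expects you to build the linked list yourself.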

Tips & Variations

Common Mistakes to Avoid

  • Ignoring Capacity: Failing to implement eviction logic when the cache is full.

  • Inefficient Data Structures: Using lists for storage can lead to O(n) access times.

  • Not Handling Edge Cases: Forgetting to handle cases where get is called for a non-existent key.
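The second pitfall above is concrete: moving an item to the front of a plain Python list costs linear time, whereas the hash-map plus doubly-linked-list design does it in O(1). A small illustration of the list-based cost:

```python
# A list-based "LRU order" pays O(n) on every access:
order = [3, 2, 1]        # most recently used first
key = 1
order.remove(key)        # O(n): scans the list to find the key
order.insert(0, key)     # O(n): shifts every other element right
assert order == [1, 3, 2]
```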

Alternative Ways to Answer

  • For a technical role: Focus on the algorithmic complexity and discuss trade-offs of different data structures.

  • For a managerial role: Emphasize how this design can be scaled or modified for larger systems and discuss potential use cases.

Role-Specific Variations

  • Technical Positions: Discuss the time complexity in-depth and consider alternative implementations (e.g., using only a list).

  • Creative Roles: Relate the LRU cache concept to user experience, explaining how efficient data retrieval can enhance application performance.

Follow-Up Questions

  • How would you modify the cache to support concurrency?

  • What would be the impact of changing the data structure used for the cache?

  • Can you explain how you would test this implementation for edge cases?
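For the first follow-up question, one straightforward answer is to guard every operation with a single lock; note that even get mutates the recency order, so both operations need protection. A minimal sketch (the class name is illustrative, and it is built on collections.OrderedDict for brevity rather than the linked-list version above):

```python
import threading
from collections import OrderedDict

class ThreadSafeLRUCache:
    """LRU cache guarded by one coarse-grained lock."""
    def __init__(self, capacity: int):
        self.capacity = capacity
        self._cache = OrderedDict()
        self._lock = threading.Lock()  # serializes all cache operations

    def get(self, key: int) -> int:
        with self._lock:
            if key not in self._cache:
                return -1
            self._cache.move_to_end(key)  # reordering is a write, so it needs the lock
            return self._cache[key]

    def put(self, key: int, value: int) -> None:
        with self._lock:
            if key in self._cache:
                self._cache.move_to_end(key)
            self._cache[key] = value
            if len(self._cache) > self.capacity:
                self._cache.popitem(last=False)

# Concurrent puts from several threads
cache = ThreadSafeLRUCache(capacity=128)
workers = [threading.Thread(target=cache.put, args=(i, i * 2)) for i in range(8)]
for w in workers:
    w.start()
for w in workers:
    w.join()
assert cache.get(3) == 6
```

A single lock is simple and correct but serializes all access; for higher throughput one could discuss sharding the cache across several independently locked segments, at the cost of only approximate global LRU ordering.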

By following this structured approach, job seekers can craft comprehensive and effective responses to interview questions about designing data structures such as an LRU cache. This response not only demonstrates technical ability but also shows clear, structured communication.

Interview Copilot: Your AI-Powered Personalized Cheatsheet
