
Implementing an O(1) LRU Cache


In this post we will explore how to implement a (L)east (R)ecently (U)sed cache with O(1) time complexity. For more info on what exactly O(1) time complexity is, check out my post on Big O notation. This problem originally comes from LeetCode.com, problem #146… and I encourage you to try your hand at the problem yourself, using your favorite language, once you understand how to go about the implementation.


The problem statement

Let's jump right into the problem statement to better understand what is required.

Design and implement a data structure for a Least Recently Used (LRU) cache. It should support the following operations: get and put.

get(key) – Get the value (will always be positive) of the key if the key exists in the cache, otherwise return -1.
put(key, value) – Set or insert the value if the key is not already present. When the cache reaches its capacity, it should invalidate the least recently used item before inserting a new item.

The cache is initialized with a positive capacity.

Follow up:
Could you do both operations in O(1) time complexity?

Breaking down the requirements

We are to create a data structure that has two methods, so let's quickly define an interface that represents this requirement.

/// <summary>
/// Interface that defines the methods used to interact with the LRU cache implementation.
/// </summary>
interface ILRUCache
{
    /// <summary>
    /// Used to resolve the value for a cached key.
    /// - Key will always be a positive number.
    /// - Resolution of a known key makes it the most recently used.
    /// </summary>
    /// <param name="key">Key to resolve.</param>
    /// <returns>Cached value when found, otherwise -1.</returns>
    int Get(int key);

    /// <summary>
    /// Used to add a key / value pair into the cache.
    /// - Added to the cache if it is not already present.
    /// - Updates the key as being the most recently used if already present.
    /// - Evicts the least recently used item to make room for this one, if at max capacity.
    /// </summary>
    /// <param name="key">Key to add.</param>
    /// <param name="value">Value associated with the key.</param>
    void Put(int key, int value);
}

We can see from the interface above that the minimum requirements are simple… but the problem is not quite as simple as it may seem. The added complexity comes from the need to implement this data structure with O(1) time complexity.

The O(1) time complexity requirement simply means that it should take no more processing to use this interface when the cache has a capacity of 2 than it would with a much larger capacity. This means we can't use a brute force sorting algorithm or iterate over the cached values… as that would not be O(1).


The data structures

To implement a solution for this problem we need a solid understanding of the available data structures, including their strengths as well as their limitations. I won't cover them all, but let's have a quick look at the data structures we will use in the solution.

The HashTable

This data structure goes by several names across programming languages, but at its core it holds a collection of key / value pairs. The strength of this data structure is that it allows us to look up a value by its key, and it does so very fast: "close to O(1)", or amortized O(1).

In our implementation this will take the form of a Dictionary<int, CachedItem<int>>.
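As a quick illustration (this snippet is not part of the final solution, just a demo of the lookup behavior we rely on), keyed lookups against a Dictionary run in amortized O(1) time regardless of how many entries it holds:

```csharp
using System;
using System.Collections.Generic;

class HashTableDemo
{
    static void Main()
    {
        var lookup = new Dictionary<int, string>();
        lookup.Add(1, "one");
        lookup.Add(2, "two");

        // TryGetValue resolves a key in amortized O(1) time,
        // without throwing when the key is absent.
        if (lookup.TryGetValue(2, out var value))
            Console.WriteLine(value); // prints "two"

        Console.WriteLine(lookup.ContainsKey(99)); // prints "False"
    }
}
```

This is exactly why the cache below uses TryGetValue in Get: it answers "is this key cached, and if so, what is its node?" in a single constant-time call.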

Doubly Linked List

The doubly linked list is an implementation of a linked list that allows for bi-directional navigation across its chained items. This differs from a singly linked list, which only allows traversal in a single direction (very inefficient if you need to iterate in reverse). The doubly linked list does not suffer that problem, since each item holds references to both the "Next" and "Previous" elements in the chain. The downside is that it uses more memory than a singly linked list, but that is the trade-off for the ability to iterate in both directions.

It is worth mentioning that .NET comes with a built-in implementation of this data structure (LinkedList<T>), but for the purposes of this educational post, we will be implementing our own.
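For reference, here is the built-in version in action. Moving a node to the front is the same "mark as most recently used" operation we will hand-roll below, and it costs only a couple of pointer updates:

```csharp
using System;
using System.Collections.Generic;

class LinkedListDemo
{
    static void Main()
    {
        var list = new LinkedList<int>();
        list.AddLast(1);
        list.AddLast(2);
        LinkedListNode<int> node = list.AddLast(3);

        // Detach the node and re-attach it at the front: two O(1)
        // operations, no traversal required.
        list.Remove(node);
        list.AddFirst(node);

        Console.WriteLine(string.Join(", ", list)); // prints "3, 1, 2"
    }
}
```

The key insight: as long as you already hold a reference to the node (which our dictionary provides), relinking it anywhere in the chain never requires walking the list.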


Implementing a solution

The solution requires that we define an object that will be cached and a list implementation that will manage those items.

CachedItem<T>

Defines the item to be cached. This is most commonly called the Node or Node<T>… but for the purposes of this example we will be a bit more explicit and call it CachedItem<T>.

/// <summary>
/// Defines a single cached item in the double linked list.
/// </summary>
/// <typeparam name="T">Datatype of the value to be stored.</typeparam>
public class CachedItem<T> {

    /// <summary>
    /// Reference to the cached item one slot before this item.
    /// </summary>
    public CachedItem<T> Prev { get; set; }

    /// <summary>
    /// Reference to the cached item one slot after this item.
    /// </summary>
    public CachedItem<T> Next { get; set; }

    /// <summary>
    /// The cached value.
    /// </summary>
    public T Value { get; set; }

    /// <summary>
    /// The associated key.
    /// </summary>
    public int Key { get; set; }
}

LRUCache

This is the (well commented) implementation of the ILRUCache itself.

/// <summary>
/// Least recently used cache implementation.
/// </summary>
public class LRUCache : ILRUCache {

    /// <summary>
    /// Least recently used buffer.
    /// - Commonly named the Tail.
    /// </summary>
    private CachedItem<int> lruBuffer { get; set; }

    /// <summary>
    /// Most recently used buffer.
    /// - Commonly named the Head.
    /// </summary>
    private CachedItem<int> mruBuffer { get; set; }

    /// <summary>
    /// Dictionary of key / value pairs.
    /// </summary>
    private Dictionary<int, CachedItem<int>> keys { get; set; }

    /// <summary>
    /// Maximum capacity for the cache.
    /// </summary>
    private int max { get; set; }

    /// <summary>
    /// Adds an item to the cache.
    /// - Items are inserted into the most recently used slot.
    /// </summary>
    private void addItem(CachedItem<int> item) {

        // Get the current most recently used item.
        // - The one just previous to the mru buffer.
        var oldMru = this.mruBuffer.Prev;

        // We are adding a new item, so it becomes the most recently used.
        // - We link the new item as the Next of the old mru
        // - and also as the Prev of the mruBuffer.
        // -- Note:
        // -- This is the same as inserting after
        // -- the old mru item and before the buffer.
        oldMru.Next = item;
        this.mruBuffer.Prev = item;

        // We must also update the pointers both on
        // the new item as well as the mru buffer.
        item.Next = this.mruBuffer;
        item.Prev = oldMru;

        // We then update the dictionary, such that the key
        // points to the CachedItem object.
        this.keys.Add(item.Key, item);
    }

    /// <summary>
    /// Removes the passed item from the cache.
    /// - The item is de-referenced, functionally removing it from the chain.
    /// </summary>
    private void removeItem(CachedItem<int> item) {

        // In order to remove the item, we just de-reference
        // the linked items that point to it... allowing the
        // next item in the cache to take its place.
        CachedItem<int> prev = item.Prev;
        CachedItem<int> next = item.Next;

        // Skip the item via the item just before it... by
        // setting the previous item's next property to that
        // of the item just after the one we are removing.
        prev.Next = next;

        // Do the same thing (only the inverse) with the
        // next item.
        next.Prev = prev;

        // Remove the key from the dictionary.
        this.keys.Remove(item.Key);
    }

    /// <summary>
    /// Self populating constructor.
    /// </summary>
    /// <param name="capacity">Max number of items that the cache can hold.</param>
    public LRUCache(int capacity) {

        // Instantiate the dict.
        this.keys = new Dictionary<int, CachedItem<int>>();

        // Init the Buffers.
        this.lruBuffer = new CachedItem<int> { Value = 0 };
        this.mruBuffer = new CachedItem<int> { Value = 0 };

        // Link lru / mru buffers to each other.
        this.lruBuffer.Next = this.mruBuffer;
        this.lruBuffer.Prev = this.mruBuffer;
        this.mruBuffer.Next = this.lruBuffer;
        this.mruBuffer.Prev = this.lruBuffer;

        // Set the max capacity.
        this.max = capacity;
    }

    /// <summary>
    /// Used to resolve the value for a cached key.
    /// - Key will always be a positive number.
    /// - Resolution of a known key makes it the most recently used.
    /// </summary>
    /// <param name="key">Key to resolve.</param>
    /// <returns>Cached value when found, otherwise -1</returns>
    public int Get(int key) {

        // Try to get the cached item by key.
        CachedItem<int> item;
        if(!this.keys.TryGetValue(key, out item))
            return -1;

        // Remove from current spot in the chain.
        // - We do this so we can re-add it as the mru item.
        this.removeItem(item);

        // Add as the most recently used.
        this.addItem(item);

        // Finally, return the cached value.
        return item.Value;
    }

    /// <summary>
    /// Used to add a key / value pair into the cache.
    /// - Added to the cache if it is not already present.
    /// - Updates the key as being the most recently used if already present.
    /// - Evicts the least recently used item to make room for this one, if at max capacity.
    /// </summary>
    /// <param name="key">Key to add.</param>
    /// <param name="key">Value associated with the key.</param>
    public void Put(int key, int value) {

        // Try to get the cached item by key.
        if(this.keys.ContainsKey(key)) {

            // The key exists already... so we need to remove it.
            // - This is done so we can re-add it as the mru item.
            this.removeItem(this.keys[key]);
        }
        else if(this.keys.Count == this.max) {

            // The max has been reached and we need to make room
            // for this new key.
            // So we remove the lru item ( the one just after the lru buffer ).
            this.removeItem(this.lruBuffer.Next);
        }

        // Add the new item as the mru item.
        // - Note: the Key must be set here, since addItem uses it
        //   to index the item in the dictionary.
        this.addItem(new CachedItem<int> { Key = key, Value = value });
    }
}
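To see the cache in action, here is a quick walkthrough (the harness class is my own, and it assumes Put sets the Key on the newly created CachedItem so the dictionary bookkeeping works) that mirrors the example from the LeetCode problem:

```csharp
using System;

class Program
{
    static void Main()
    {
        var cache = new LRUCache(2); // capacity of 2

        cache.Put(1, 100);
        cache.Put(2, 200);
        Console.WriteLine(cache.Get(1)); // prints 100; key 1 is now the mru

        cache.Put(3, 300);               // at capacity, so key 2 (the lru) is evicted
        Console.WriteLine(cache.Get(2)); // prints -1; key 2 was evicted
        Console.WriteLine(cache.Get(3)); // prints 300
    }
}
```

Notice that Get(1) is what saves key 1 from eviction: resolving it moves it to the mru slot, leaving key 2 as the least recently used when the cache fills up.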

There you have it! — An O(1) LRU Cache

I hope this helps some of you gain a better understanding of how to implement an LRU cache. While this example is written in C#, the logic remains the same across programming languages.

I hope you enjoyed the read!

Published in Technical Articles
