Cache Eviction Policy

What is Caching and Why Eviction?

A cache is a small, fast storage layer that holds copies of frequently accessed data from a larger, slower primary store (like RAM caching disk data, or a web server caching database results). Serving a request from the cache (a cache hit) is much faster than fetching the data from the original source (a cache miss).

Since the cache has limited size, it will eventually become full. When new data needs to be added but the cache is full, an existing item must be removed (evicted) to make space. A cache eviction policy is the algorithm used to decide which item to discard. The goal is to evict the item least likely to be needed soon, maximizing the chance that future requests result in cache hits.
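The shape of this problem can be sketched in a few lines of Python. This is a minimal illustration, not a production implementation: the class name `SimpleCache`, the `fetch` callback, and the arbitrary-victim eviction are all assumptions made for the example; the later policies differ only in how the victim is chosen.

```python
class SimpleCache:
    """A bounded cache over a slow `fetch` function (illustrative sketch)."""

    def __init__(self, capacity, fetch):
        self.capacity = capacity
        self.fetch = fetch        # slow lookup against the primary store
        self.store = {}

    def get(self, key):
        if key in self.store:     # cache hit: fast path
            return self.store[key]
        value = self.fetch(key)   # cache miss: go to the slow source
        if len(self.store) >= self.capacity:
            # The eviction policy decides which item to discard here.
            # This placeholder just removes an arbitrary entry.
            victim = next(iter(self.store))
            del self.store[victim]
        self.store[key] = value
        return value
```

Every policy below plugs a different rule into that "choose a victim" step.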

Common Eviction Policies

1. FIFO (First-In, First-Out)

This is the simplest policy. It treats the cache like a queue. The item that has been in the cache the longest is the first one to be evicted, regardless of how often or recently it was accessed.
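A minimal FIFO sketch, assuming Python's `collections.OrderedDict` as the queue (the class name and method names here are illustrative): the key point is that reads never reorder anything, so the oldest insertion is always the victim.

```python
from collections import OrderedDict

class FIFOCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self.store = OrderedDict()  # remembers insertion order

    def get(self, key):
        # Reads do NOT affect eviction order under FIFO.
        return self.store.get(key)

    def put(self, key, value):
        if key not in self.store and len(self.store) >= self.capacity:
            self.store.popitem(last=False)  # evict the oldest inserted item
        self.store[key] = value
```

Note the weakness this exposes: a heavily used item is still evicted once it becomes the oldest entry.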

2. LRU (Least Recently Used)

This policy assumes that data accessed recently is likely to be accessed again soon. It evicts the item that hasn't been accessed for the longest time.
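A minimal LRU sketch, again using `OrderedDict` (an illustrative choice, not the only one; production caches often use a hash map plus doubly linked list for the same effect). The difference from FIFO is that every access moves the item to the "most recent" end:

```python
from collections import OrderedDict

class LRUCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self.store = OrderedDict()  # ordered from least to most recently used

    def get(self, key):
        if key not in self.store:
            return None
        self.store.move_to_end(key)  # mark as most recently used
        return self.store[key]

    def put(self, key, value):
        if key in self.store:
            self.store.move_to_end(key)
        elif len(self.store) >= self.capacity:
            self.store.popitem(last=False)  # evict least recently used
        self.store[key] = value
```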

3. LFU (Least Frequently Used)

This policy assumes that data accessed most often is more important and should stay in the cache. It evicts the item that has been accessed least often.
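A minimal LFU sketch, with a plain dict of access counts (an illustrative simplification: real LFU implementations use frequency buckets to avoid the linear scan for the minimum, and define a tie-breaking rule; here ties are broken arbitrarily).

```python
class LFUCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self.store = {}
        self.counts = {}  # number of accesses per key

    def get(self, key):
        if key not in self.store:
            return None
        self.counts[key] += 1
        return self.store[key]

    def put(self, key, value):
        if key not in self.store and len(self.store) >= self.capacity:
            # Evict the key with the fewest accesses (ties broken arbitrarily).
            victim = min(self.counts, key=self.counts.get)
            del self.store[victim]
            del self.counts[victim]
        self.store[key] = value
        self.counts.setdefault(key, 0)
```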

Visualize and Play

Set the cache size, select a policy, and access items to see how the cache behaves and which items get evicted.
