A Quick Look at Cache Aside


Overview

Cache Aside is the most commonly used caching mechanism. Its operating model is shown below:

(Figure: Cache Aside operating model)

This mechanism is best suited for workloads with many reads and few writes. Its effectiveness also depends on how often the returned data changes (data fetched by primary key, for example, rarely changes).

A commonly used tool for this is Memcached.

The benefit here is the ability to recover data when the cache fails (the application simply queries the DB again).

Noteworthy:

  • When the DB is updated, the application code is responsible for creating/updating (or invalidating) the corresponding cache entry (see the sketch after this list).
  • A cache eviction mechanism is needed when the cache is full (LRU or LFU).
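
To illustrate those two points, here is a minimal TypeScript sketch of the Cache Aside read and write paths. The Cache and Db interfaces, the user: key prefix, and the TTL value are assumptions made for this example; they do not come from the original article or any specific library.

```typescript
// Hypothetical interfaces standing in for a real cache client
// (such as Memcached) and a real database layer.
interface Cache {
  get(key: string): Promise<string | null>;
  set(key: string, value: string, ttlSeconds: number): Promise<void>;
  delete(key: string): Promise<void>;
}

interface Db {
  findUserById(id: string): Promise<{ id: string; name: string } | null>;
  updateUser(id: string, name: string): Promise<void>;
}

const TTL_SECONDS = 300; // assumed TTL for the example

// Read path: try the cache first, fall back to the DB on a miss,
// then populate the cache so later reads are served from memory.
async function getUser(cache: Cache, db: Db, id: string) {
  const cached = await cache.get(`user:${id}`);
  if (cached !== null) {
    return JSON.parse(cached);
  }
  const user = await db.findUserById(id);
  if (user !== null) {
    await cache.set(`user:${id}`, JSON.stringify(user), TTL_SECONDS);
  }
  return user;
}

// Write path: update the DB first, then invalidate the stale cache
// entry. In Cache Aside this is the application code's responsibility.
async function updateUser(cache: Cache, db: Db, id: string, name: string) {
  await db.updateUser(id, name);
  await cache.delete(`user:${id}`);
}
```

On the write path the sketch invalidates the cache entry instead of rewriting it, which avoids storing stale data when concurrent updates race; the next read repopulates the cache from the DB.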

Least Frequently Used (LFU)

Algorithm description

A caching algorithm in which, when the cache is full, the block with the fewest hits is discarded. The algorithm tracks the access frequency of each block; blocks with equal frequency are evicted according to a tie-breaking rule (for example FIFO, or evicting the block whose most recent use is furthest in the past).

How it works

The cache exposes two functions, add and get:

  • add(key, value) : adds the (key, value) pair to the cache. It first checks capacity; if there is room, the pair is added; otherwise, the entry with the lowest use frequency (and, among entries with equal frequency, the least recently used one) is evicted first.
  • get(key) : returns the value corresponding to the key (if the key does not exist, a sentinel such as the smallest integer value is returned) and updates the item's position in the cache according to its new frequency.

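Before looking at the linked implementation, here is a minimal TypeScript sketch of an LFU cache following the description above. The class name, the null miss value (used instead of the smallest-integer sentinel), and the O(n) eviction scan are choices made for this illustration; it is not the repository code linked below.

```typescript
// Minimal LFU cache sketch: evicts the entry with the lowest frequency,
// breaking ties by evicting the least recently used one.
class LFUCache<V> {
  private entries = new Map<string, { value: V; freq: number; lastUsed: number }>();
  private tick = 0;

  constructor(private capacity: number) {}

  // Returns the stored value and bumps its frequency, or null on a miss.
  get(key: string): V | null {
    const entry = this.entries.get(key);
    if (entry === undefined) return null;
    entry.freq += 1;
    entry.lastUsed = ++this.tick;
    return entry.value;
  }

  // Adds or updates a pair; evicts first if the cache is at capacity.
  add(key: string, value: V): void {
    const existing = this.entries.get(key);
    if (existing !== undefined) {
      existing.value = value;
      existing.freq += 1;
      existing.lastUsed = ++this.tick;
      return;
    }
    if (this.entries.size >= this.capacity) {
      this.evict();
    }
    this.entries.set(key, { value, freq: 1, lastUsed: ++this.tick });
  }

  // Scans for the lowest-frequency entry, preferring the one used longest ago.
  private evict(): void {
    let victim: string | null = null;
    let victimFreq = Infinity;
    let victimLastUsed = Infinity;
    for (const [key, e] of this.entries) {
      if (e.freq < victimFreq || (e.freq === victimFreq && e.lastUsed < victimLastUsed)) {
        victim = key;
        victimFreq = e.freq;
        victimLastUsed = e.lastUsed;
      }
    }
    if (victim !== null) this.entries.delete(victim);
  }
}
```

A production LFU cache would typically keep per-frequency buckets backed by doubly linked lists so that eviction runs in O(1) instead of scanning every entry.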

Algorithm implementation

https://github.com/tuananhhedspibk/BlogCode/tree/main/LFUCaching

Least Recently Used (LRU)

Algorithm description

This algorithm keeps recently used items at the front of the cache. When an item is accessed, it is moved to the front; when the cache reaches its capacity threshold, the least recently accessed items are evicted from the end of the cache.

How it works

The LRU cache provides two methods, put and get:

  • put(key, value) : sets or inserts the (key, value) pair. When the cache exceeds its capacity threshold, the least recently used item (the one at the end of the cache) is evicted.
  • get(key) : returns the value corresponding to the key, or -1 if the key does not exist; on a hit, the item is moved to the front of the cache.

For example:

(Figure: LRU caching)
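
As with LFU, here is a minimal TypeScript sketch of an LRU cache, separate from the implementation linked below. It relies on the insertion order of JavaScript's Map to track recency; the number-valued entries and the class name are assumptions made for the example.

```typescript
// Minimal LRU cache sketch: the Map's insertion order doubles as the
// recency order, with the least recently used key always iterated first.
class LRUCache {
  private entries = new Map<string, number>();

  constructor(private capacity: number) {}

  // Returns -1 on a miss; on a hit, re-inserts the key so it becomes
  // the most recently used entry.
  get(key: string): number {
    const value = this.entries.get(key);
    if (value === undefined) return -1;
    this.entries.delete(key);
    this.entries.set(key, value);
    return value;
  }

  // Sets or inserts the pair; when over capacity, evicts the least
  // recently used entry (the oldest key in the Map's iteration order).
  put(key: string, value: number): void {
    if (this.entries.has(key)) {
      this.entries.delete(key);
    } else if (this.entries.size >= this.capacity) {
      const oldest = this.entries.keys().next().value;
      if (oldest !== undefined) this.entries.delete(oldest);
    }
    this.entries.set(key, value);
  }
}
```

Deleting and re-inserting a key on every access moves it to the end of the Map's iteration order, so the first key returned by the iterator is always the least recently used one and can be evicted directly.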

Algorithm implementation

https://github.com/tuananhhedspibk/BlogCode/blob/main/LRUCaching/index.ts


Source: Viblo