Backend Caching Strategies by Brij Kishore

Source Link

https://www.linkedin.com/posts/brijpandeyji_imagine-youre-in-a-library-the-books-activity-7098264772470525952-bjC-?utm_source=share&utm_medium=member_desktop


Imagine you're in a library. 📚

The books read most frequently are placed on an easily accessible shelf right next to the reading area.

Why?

To speed up access and improve the reading experience.

This is exactly what caching does in the realm of computing.

Think of cache as a high-speed storage mechanism that can be either a reserved section of main memory or an independent high-speed storage device.

It's a place to store a subset of data - typically, the most frequently used data - in a location where it can be accessed quickly.

Caching is used extensively across various facets of computer systems - in hardware (like CPU cache), operating systems, web browsers, and of course, backend development.

Today, we're diving into its role on the backend side of things, which is vital for anyone working on or studying system performance and efficiency.

Now, why is backend caching important?

When dealing with web applications, caching on the server-side can significantly enhance performance by reducing database load, cutting down latency, and improving request handling capacity.

It's like having a mini database that is faster to access, storing the data that is needed most often. 🚀

Now, we're going to focus on different backend caching strategies: Read Through, Write Through, Write Around, Write Back, and Cache Aside.

𝗥𝗲𝗮𝗱 𝗧𝗵𝗿𝗼𝘂𝗴𝗵:

1. The client requests data.
2. The cache intercepts the request.
3. If the data is in the cache (cache hit), it is returned to the client.
4. If the data is not in the cache (cache miss), the cache fetches the data from the database.
5. The cache stores a copy of the data and returns it to the client.
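
The steps above can be sketched in a few lines of Python. The class and function names here (ReadThroughCache, load_fn) are illustrative, not a real library API; a plain dict stands in for the database.

```python
# Minimal read-through cache sketch: the client only ever talks to the
# cache, and the cache itself fetches from the backing store on a miss.

class ReadThroughCache:
    def __init__(self, load_fn):
        self._store = {}       # in-memory cache storage
        self._load = load_fn   # fetches from the database on a miss

    def get(self, key):
        if key in self._store:        # cache hit: return cached copy
            return self._store[key]
        value = self._load(key)       # cache miss: cache fetches from DB
        self._store[key] = value      # store a copy for future reads
        return value


# Usage: a dict stands in for the database.
db = {"user:1": "Ada"}
cache = ReadThroughCache(lambda k: db[k])
first = cache.get("user:1")   # miss: loaded from db, then cached
second = cache.get("user:1")  # hit: served from the cache
```

The key point is that the loading logic lives inside the cache, so client code never touches the database directly.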
   
𝗪𝗿𝗶𝘁𝗲 𝗧𝗵𝗿𝗼𝘂𝗴𝗵:

1. The client writes data.
2. The write operation is directed at the cache.
3. The cache updates its data.
4. The cache then ensures that the data is also written to the database.
5. The cache then acknowledges the write operation to the client.
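
A minimal write-through sketch, with a plain dict standing in for the database and illustrative names: the cache updates itself, writes synchronously to the database, and only then acknowledges the client.

```python
# Write-through cache sketch: every write hits both the cache and the
# database before the client receives an acknowledgement.

class WriteThroughCache:
    def __init__(self, db):
        self._store = {}   # in-memory cache storage
        self._db = db      # backing database (a dict here)

    def put(self, key, value):
        self._store[key] = value   # 1. cache updates its copy
        self._db[key] = value      # 2. synchronous write to the database
        return True                # 3. acknowledge after both succeed

    def get(self, key):
        return self._store.get(key)
```

Writes are slower than write-back because the database write is on the request path, but the cache and database never disagree.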
   
𝗪𝗿𝗶𝘁𝗲 𝗔𝗿𝗼𝘂𝗻𝗱:

1. The client writes data.
2. The write operation is directed only at the database, bypassing the cache.
3. The data does not immediately enter the cache.
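
A sketch of write-around, again with a dict as the database. One common variant, assumed here, also invalidates any stale cached copy on write so a later read reloads fresh data; reads populate the cache lazily.

```python
# Write-around cache sketch: writes bypass the cache and go straight
# to the database; the cache is only filled when data is read.

class WriteAroundCache:
    def __init__(self, db):
        self._store = {}
        self._db = db

    def put(self, key, value):
        self._db[key] = value        # write only to the database
        self._store.pop(key, None)   # evict any stale cached copy

    def get(self, key):
        if key not in self._store:   # miss: load from the database
            self._store[key] = self._db[key]
        return self._store[key]
```

This avoids filling the cache with data that is written once and rarely read again.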
   
𝗪𝗿𝗶𝘁𝗲 𝗕𝗮𝗰𝗸 (𝗼𝗿 𝗪𝗿𝗶𝘁𝗲 𝗕𝗲𝗵𝗶𝗻𝗱):

1. The client writes data.
2. The write operation is directed at the cache.
3. The cache updates its data and immediately acknowledges the write operation to the client.
4. The cache then asynchronously writes the data to the database.
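
A write-back sketch with illustrative names. Real implementations flush dirty entries on a timer or via a background queue; here the asynchronous step is made an explicit flush() method so the deferred database write is easy to see.

```python
# Write-back (write-behind) cache sketch: the cache acknowledges the
# write immediately and persists dirty entries to the database later.

class WriteBackCache:
    def __init__(self, db):
        self._store = {}
        self._dirty = set()   # keys cached but not yet written to the DB
        self._db = db

    def put(self, key, value):
        self._store[key] = value   # update the cache and ack immediately
        self._dirty.add(key)       # remember it still needs persisting

    def flush(self):
        # In practice this runs asynchronously (timer/queue); it is an
        # explicit method here only for clarity.
        for key in self._dirty:
            self._db[key] = self._store[key]
        self._dirty.clear()
```

The trade-off: writes are fast, but data acknowledged to the client can be lost if the cache fails before the flush.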
  
𝗖𝗮𝗰𝗵𝗲 𝗔𝘀𝗶𝗱𝗲 (𝗼𝗿 𝗟𝗮𝘇𝘆 𝗟𝗼𝗮𝗱𝗶𝗻𝗴):

1. The client requests data.
2. If the data is in the cache (cache hit), it is returned to the client.
3. If the data is not in the cache (cache miss), the client fetches the data from the database.
4. The client then writes any data it reads into the cache.
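
In cache-aside, unlike read-through, the application code itself coordinates the cache and the database. A sketch with plain dicts for both and a hypothetical get_user helper:

```python
# Cache-aside (lazy loading) sketch: the *client* checks the cache,
# falls back to the database on a miss, and populates the cache itself.

def get_user(user_id, cache, db):
    key = f"user:{user_id}"
    value = cache.get(key)
    if value is not None:   # cache hit
        return value
    value = db[key]         # cache miss: client queries the database
    cache[key] = value      # client writes what it read into the cache
    return value


# Usage:
db = {"user:7": "Grace"}
cache = {}
name = get_user(7, cache, db)   # miss: loads from db and caches
name = get_user(7, cache, db)   # hit: served from the cache
```

Because the application owns the logic, cache-aside works with any plain key-value store, which is why it is the most common pattern with tools like Redis or Memcached.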


