
Part 6: Understanding System Design - Caching Strategies

In this episode, I will explore the Distributed Caching component more deeply and explore different caching strategies, caching eviction policies, and caching layers.

In partnership with

Hello 👋

Welcome to another week, another opportunity to become a Great Backend Engineer.

Today’s issue is brought to you by NuxifySaaS, a Nuxt SaaS boilerplate with ready-to-go components for payments, auth, admin, blog, SEO, database, mailing, and templating (landing pages, dashboards), and everything you need to ship your SaaS in days!

Before we get down to today’s business: Part 6 of Understanding System Design.

I have a gift for you: You will love this one.

Learn how to make AI work for you.

AI breakthroughs happen every day. But where do you learn to actually apply the tech to your work? Join The Rundown — the world’s largest AI newsletter read by over 600,000 early adopters staying ahead of the curve.

  1. The Rundown’s expert research team spends all day learning what’s new in AI

  2. They send you daily emails on impactful AI tools and how to apply them

  3. You learn how to become 2x more productive by leveraging AI

Now, back to the business of today.

In the previous edition, I discussed one of the system components: Distributed Caching. You learned everything you need to know about distributed caching, what caching is, its benefits, and more. Check it out here if you haven’t.


Overview of Caching Strategies

The caching strategy you choose is a significant factor in the performance of any RESTful API or web page. If not considered and optimized correctly, it can hurt both the user experience and the business.

There are many different caching strategies available. Depending on your application’s use case and data structures, you can also develop custom caching strategies that fit your project perfectly. The most common ones are:

  1. Cache Aside (Lazy Loading)

  2. Read Through

  3. Write Through

  4. Write Back (Write Behind)

  5. Write Around

Cache Aside (Lazy Loading)

This is the most popular way of caching. The cache sits aside from the database, and the application requests data from the cache server first.

If the data is found in the cache server, the request is returned to the client with the data. However, if the data is not found in the cache server, the server queries the database server for the data, and in return, the data is stored in the Cache Server for subsequent requests.

Most importantly, this strategy introduces two important concepts:

  • When the data is found in the cache (Cache Hit).

  • When the data is not found in the cache (Cache Miss).

We can create an event and attach a listener to be triggered when any of those happen.
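As a sketch of that idea, hit and miss listeners can be wired up with Node’s built-in EventEmitter. The `Event.hit()`/`Event.miss()` helpers below are hypothetical stand-ins for the custom event object used in this article, not a real library API:

```typescript
import { EventEmitter } from "node:events";

// A tiny stand-in for the custom Event object used in this article.
const cacheEvents = new EventEmitter();

// Attach listeners, e.g. to record hit/miss metrics.
let hits = 0;
let misses = 0;
cacheEvents.on("hit", () => { hits++; });
cacheEvents.on("miss", () => { misses++; });

// Hypothetical Event.hit()/Event.miss() helpers.
const Event = {
  hit: (key: string) => cacheEvents.emit("hit", key),
  miss: (key: string) => cacheEvents.emit("miss", key),
};

Event.miss("user:1"); // first request: the key is not in the cache yet
Event.hit("user:1");  // subsequent request: served from the cache
```

In practice, these listeners are a convenient place to record cache metrics or warm related keys.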

A screenshot of the Cache Aside Strategy

Code Implementation

Let’s look at a simple implementation of the Cache Aside strategy. I will use TypeScript, but the idea applies to all programming languages.

public async cacheAsideStrategy(key: string, minutes: number, callback: () => Promise<any>): Promise<any> {
  if (await Cache.has(key)) {
      const data = await Cache.get(key)
      Event.hit(key, data, minutes);
      return data
  } else {
      Event.miss(key, [], minutes)
      // Database Server is called outside the Cache Server.
      const data = await callback()
      await Cache.set(key, data, minutes)
      return data
  }
}

From the code snippet above, you will notice that I first try to get the data from my cache server (it could be Redis or Memcached); if it is found, I trigger the Event.hit() method (a custom event I created).

Next, if the key is not found, I search for the data in my database server using the callback() function and trigger the Event.miss() event.

Lastly, we store the data on our Cache Server for subsequent requests.
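To make the flow concrete, here is a self-contained, in-memory sketch of the same strategy. The Map stands in for Redis/Memcached, the callback stands in for the database query, and TTL handling is omitted for brevity:

```typescript
// In-memory stand-in for the cache server (Redis/Memcached).
const cache = new Map<string, any>();

async function cacheAside(key: string, callback: () => Promise<any>): Promise<any> {
  if (cache.has(key)) {
    return cache.get(key); // Cache Hit
  }
  const data = await callback(); // Cache Miss: fall back to the "database"
  cache.set(key, data);          // store for subsequent requests
  return data;
}

// Usage: the first call queries the "database", the second is served from cache.
(async () => {
  const user = await cacheAside("user:1", async () => ({ id: 1, name: "Ada" }));
  const cached = await cacheAside("user:1", async () => ({ id: 1, name: "Ada" }));
})();
```

Note that the application, not the cache, is responsible for falling back to the database here; that is the defining trait of Cache Aside.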

Read Through

In the Read-Through strategy, unlike Cache Aside, the cache server sits between the client request and the database server.

When a request comes in, it goes directly to the Cache server, and if there is a miss, the cache server is responsible for retrieving the data from the database server, updating itself for subsequent requests, and returning the data to the client.

Sounds confusing, right?

Let’s take a look at this diagram and the code implementation below:

A screenshot of the Read Through Strategy

Code Implementation

Let’s look at a simple implementation of the Read-Through strategy.

public async readThrough(key: string, minutes: number): Promise<any> {
      const data = await Cache.get(key, minutes)
      return data
}

// Inside Cache Server
private async get(key: string, minutes: number){
    const data = await Cache.find(key)
    if(data){
      Event.hit(key, data, minutes);
      return data
    }

    Event.miss(key, [], minutes)
    // Database Server is called from the Cache Server.
    const DBdata = await Database.find(key)
    await Cache.set(key, DBdata, minutes)
    return DBdata
}

At a glance, the Read-Through Strategy is very similar to the Lazy Loading Strategy we discussed above. The only difference is that the Cache Server calls the Database Server on any Cache Miss.
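That difference can be seen in a self-contained, in-memory sketch. The `CacheServer` class stands in for a read-through cache (for example, a Redis wrapper), and the `database` Map stands in for the database server; both names are illustrative:

```typescript
// In-memory stand-in for the database server, pre-seeded with one record.
const database = new Map<string, any>([["post:1", { title: "Hello" }]]);

class CacheServer {
  private store = new Map<string, any>();

  // The application only ever calls this method; on a miss, the cache
  // server itself queries the database and updates itself.
  async get(key: string): Promise<any> {
    if (this.store.has(key)) return this.store.get(key); // Cache Hit
    const data = database.get(key); // Cache Miss: cache queries the database
    this.store.set(key, data);      // cache updates itself for next time
    return data;
  }
}

const cacheServer = new CacheServer();
```

The application code shrinks to a single `cacheServer.get(key)` call, because the database fallback now lives inside the cache layer.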

Write Through

When implementing a caching strategy for write operations, the Write-Through strategy is the write-side counterpart of the Read-Through strategy.

In this strategy, every write operation must go through the cache server before reaching the database server.

A screenshot of Write-Through operation

Code Implementation

Let’s look at a simple implementation of the Write-Through strategy.

public async writeThrough(key: string, data: any, minutes: number): Promise<any> {
    const cacheData = await Cache.set(key, data, minutes)
    return cacheData
}

// Inside Cache Server
private async set(key: string, data: any, minutes: number){
    await Cache.set(key, data, minutes)
    // Database Server is called from the Cache Server.
    await Database.create(data)
    return data
}

The implementation mirrors the Read-Through strategy, applied to write operations. We could also improve it by checking whether the key already exists on the cache server and updating the cached entry with the new data.
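As a self-contained, in-memory sketch of the idea (the two Maps stand in for the cache server and the database server; the names are mine, not from a real library):

```typescript
// In-memory stand-ins for the cache server and the database server.
const cacheStore = new Map<string, any>();
const db = new Map<string, any>();

// Write Through: every write updates the cache first and then the
// database, so the two are always in sync.
async function writeThrough(key: string, data: any): Promise<any> {
  cacheStore.set(key, data); // the cache always holds the latest copy
  db.set(key, data);         // the database receives the same write
  return data;
}

// Usage: after a write, reads can be served from the cache immediately.
(async () => {
  await writeThrough("user:1", { name: "Ada" });
})();
```

The trade-off is write latency (two writes per operation) in exchange for a cache that is never stale.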

Write Back (Write Behind)

This strategy is a more advanced version of the Write Through Strategy and can also be called Write Behind.

It has the same structure: every write operation goes through the cache server before going to the database server, but there is a delay before the data is written to the database server.

So, a client can send in 5 write requests, and the cache server will store them all; it will only flush the updated data to the database server at an interval of every minute (or more).

Let’s take a look at this diagram and the code implementation below:

Code Implementation

Let’s look at a simple implementation of the Write-Back strategy.

// Flush interval in minutes
private durationToFlush: number = 1;
private tempDataToFlush: Array<{ duration: number, data: any }> = [];

public async writeBack(key: string, data: any, minutes: number): Promise<any> {
    const cacheData = await Cache.set(key, data, minutes)
    return cacheData
}

//----------Inside Cache Server---------------------

private async set(key: string, data: any, minutes: number){
    await Redis.set(key, data, minutes)
    this.storeForUpdates(data)
    return data
}

// Stores new data in a temp array, stamped with the time it is due to be flushed
private storeForUpdates(data: any){
    const tempData = {
        duration: this.getFlushTimeInMilli(),
        data: data
    }
    this.tempDataToFlush.push(tempData)
}

// Returns the timestamp (in milliseconds) at which an entry becomes due for flushing
private getFlushTimeInMilli(){
    const currentDate = new Date()
    const futureDate = new Date(currentDate.getTime() + this.durationToFlush * 60000)
    return futureDate.getTime()
}

// Called on a schedule to flush all due entries to the Database Server.
public async updateDatabaseServer(){
    const now = new Date().getTime()
    const remaining: Array<{ duration: number, data: any }> = []

    for (const obj of this.tempDataToFlush) {
        if (obj.duration <= now && await Database.create(obj.data)) {
            continue // flushed successfully, so drop it from the temp array
        }
        remaining.push(obj)
    }

    this.tempDataToFlush = remaining
}

Set up a cron job that runs every minute and updates the database server by calling the updateDatabaseServer() method. You can read how to set up cron jobs in the Laravel Cron: The Definitive Guide article.

So far, the implementation is a bit long and tedious, and the code design is imperfect. For a production-ready Write-Back strategy, tempDataToFlush should be redesigned to use a proper data structure, such as a queue.

The above code is an untested implementation of the Write-Back Strategy. Let me know if you notice any bugs and how to improve the implementation.
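If you don’t want an external cron, the flush can also be scheduled in-process. Below is a minimal, self-contained sketch using setInterval as a stand-in for a cron job; the `pending` array and `updateDatabaseServer` function here are simplified, illustrative versions of the ideas above, not the article’s exact code:

```typescript
// Entries waiting to be flushed, and a stand-in for the database.
const pending: Array<{ duration: number, data: any }> = [];
const flushed: any[] = [];

function updateDatabaseServer(): void {
  const now = Date.now();
  // Iterate backwards so splicing does not skip elements.
  for (let i = pending.length - 1; i >= 0; i--) {
    if (pending[i].duration <= now) {
      flushed.push(pending[i].data); // "write" the entry to the database
      pending.splice(i, 1);
    }
  }
}

// Flush due entries every 60 seconds, mimicking a per-minute cron job.
const timer = setInterval(updateDatabaseServer, 60_000);
// Call clearInterval(timer) on shutdown, flushing any remaining entries first.
```

Note the backwards iteration: removing elements from an array while walking it forwards (as a forEach-with-splice would) silently skips entries.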

Write Around

This strategy combines the Cache Aside and Read Through strategies. All write operations go directly to the database server, and only read operations update the cache server.

For example, if a user creates a new post, the post is stored directly in the database server. When the user reads the post’s content for the first time, the post is fetched from the database server and stored in the cache server for subsequent requests before being returned to the user.

Code Implementation

Let’s look at a simple implementation of the Write-Around strategy.

public async writeAround(key: string, data: any): Promise<any> {
    // Write Operations skip the Cache Server and go directly to the Database Server.
    const storedData = await Database.create(data)
    return storedData
}

public async readOperation(key: string, minutes: number){
    const cacheData = await Cache.lazyLoadingStrategy(key, minutes)
    return cacheData
}

The Write-Around strategy combines different strategies and can be customized to fit the project and the operations performed on the data.

I have utilized the lazyLoadingStrategy() for the read operations, while write operations go straight to the database server; the cache server is only updated when data is read.
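To make the combination concrete, here is a self-contained, in-memory sketch (the Maps stand in for the database server and the cache server; the names are illustrative):

```typescript
// In-memory stand-ins for the database server and the cache server.
const dbStore = new Map<string, any>();
const cacheLayer = new Map<string, any>();

// Write Around: writes go straight to the database and skip the cache.
async function writeAround(key: string, data: any): Promise<any> {
  dbStore.set(key, data);
  return data;
}

// Reads use Cache Aside: the first read populates the cache.
async function readOperation(key: string): Promise<any> {
  if (cacheLayer.has(key)) return cacheLayer.get(key); // Cache Hit
  const data = dbStore.get(key);  // Cache Miss: go to the database
  cacheLayer.set(key, data);      // cached for subsequent reads
  return data;
}
```

This split fits write-heavy data that is read rarely: the cache never fills up with entries nobody requests.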

I have not covered every caching strategy for distributed caching, and you can always design a custom caching strategy that best suits your project requirements.

That will be all for this week. I like to keep this newsletter short.

Today, I discussed further distributed caching and introduced you to different caching strategies and how to implement them.

Next week, I will finish the distributed caching by exploring topics such as the Caching Eviction Policy and Cache Layers.

Don’t miss it. Share it with a friend.

Did you learn any new things from this newsletter this week? Please reply to this email and let me know. Feedback like this encourages me to keep going.

That will be all for this one. See you on Saturday.

Remember to get the NuxifySaaS Boilerplate. It offers unmatched benefits for building products in days with Nuxt.js.

Top 5 Remote Backend Jobs this week

Here are the top 5 Backend Jobs you can apply to now.

👨‍💻 BandLab Technologies
✍️ Senior Backend Engineer, Ads Team
📍Remote, Asia, Africa, Europe
💰 Click on Apply for salary details
Click here to Apply for this role.

👨‍💻 WunderGraph
✍️ Senior Golang Engineer - Remote (EMEA)
📍Remote, Go, Golang
💰 Click on Apply for salary details
Click here to Apply for this role.

👨‍💻 Fincra
✍️ Senior Backend Engineer
📍Remote, Africa
💰 Click on Apply for salary details
Click here to Apply for this role.

👨‍💻 Pesto Tech
✍️ Backend Developer
📍Remote
💰 Click on Apply for salary details
Click here to Apply for this role.

Want more Remote Backend Jobs? Visit GetBackendJobs.com

Backend Engineering Resources

Whenever you're ready

There are 4 ways I can help you become a great backend engineer:

1. The MB Platform: Join 1000+ backend engineers learning backend engineering on the MB platform. Build real-world backend projects, track your learnings and set schedules, learn from expert-vetted courses and roadmaps, and solve backend engineering tasks, exercises, and challenges.

2. The MB Academy: The “MB Academy” is a 6-month intensive Advanced Backend Engineering BootCamp to produce great backend engineers.

3. MB Video-Based Courses: Join 1000+ backend engineers who learn from our meticulously crafted courses designed to empower you with the knowledge and skills you need to excel in backend development.

4. GetBackendJobs: Access 1000+ tailored backend engineering jobs, manage and track all your job applications, create a job streak, and never miss applying. Lastly, you can hire backend engineers anywhere in the world.

LAST WORD 👋 

How am I doing?

I love hearing from readers, and I'm always looking for feedback. How am I doing with The Backend Weekly? Is there anything you'd like to see more or less of? Which aspects of the newsletter do you enjoy the most?

Hit reply and say hello - I'd love to hear from you!

Stay awesome,
Solomon

I moved my newsletter from Substack to Beehiiv, and it's been an amazing journey. Start yours here.
