Caching is one of the key techniques for improving performance and reducing latency in data access. A cache is a fast, temporary store (generally RAM) that sits in front of a slower datastore.

When using a cache, there are different patterns for reading data from the cache and writing data to it.

Cache Aside

Ideal when caching everything is neither critical for the use case nor necessary. The application and the datastore own the data, and the cache is an optimization sidecar.

Read-Aside Cache (Lazy Loading)

  • Check Cache: The application checks the cache for the data; on a hit, the cached data is used.
  • Read from Data Source: On a miss, the application reads from the data source.
  • Update Cache: The application updates the cache after retrieving the data from the data source (sketched in code after the diagram).
sequenceDiagram
    participant App as Application
    participant Cache
    participant DB as Data Source
    App->>Cache: Read data
    alt Cache Hit
        Cache->>App: Return data
    else Cache Miss
        Cache->>App: No data
        App->>DB: Fetch data
        DB->>App: Return data
        App->>Cache: Write data
    end
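A minimal sketch of this flow in Python, using plain dicts as stand-ins for the cache and the datastore (the names cache, db, and load_user are illustrative, not tied to any particular library):

cache = {}                          # stand-in for a real cache such as Redis or Memcached
db = {"user:1": {"name": "Ada"}}    # stand-in for the datastore

def load_user(key):
    value = cache.get(key)          # check the cache first
    if value is not None:           # cache hit: use the cached data
        return value
    value = db.get(key)             # cache miss: read from the data source
    if value is not None:
        cache[key] = value          # update the cache for future reads
    return value

load_user("user:1")                 # miss: loads from the datastore and fills the cache
load_user("user:1")                 # hit: served straight from the cache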

Write-Aside Cache

With Cache Update

Use this when writes can tolerate some delay but reads cannot afford the latency of a cache miss.

  • Write to Database: The application writes the data directly to the datastore.
  • Write to Cache: If the write succeeds, the application writes the data to the cache (sketched in code after the diagram).
sequenceDiagram
    participant App as Application
    participant Cache
    participant DB as Data Source
    App->>DB: Write data
    DB-->>App: Acknowledge
    alt Write Success
        App->>Cache: Write data
    end
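A minimal sketch of this write path, again with dicts standing in for the cache and the datastore (save_user is an illustrative name; with a real datastore the cache update would only run after a successful write):

cache = {}
db = {}

def save_user(key, value):
    db[key] = value                 # write to the datastore first
    # if the datastore write raised an error, this line is never reached,
    # so the cache is only updated on a successful write
    cache[key] = value

save_user("user:1", {"name": "Ada"})
cache["user:1"]                     # subsequent reads are served from the cache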

With Cache Invalidation

Use this when writes need to be fast and a cache miss on read is affordable.

  • Write to Database: The application writes the data directly to the datastore.
  • Invalidate Cache: If the write succeeds, the application invalidates the cached entry (sketched in code after the diagram).
sequenceDiagram
    participant App as Application
    participant Cache
    participant DB as Data Source
    App->>DB: Write data
    DB-->>App: Acknowledge
    alt Write Success
        App-->>Cache: Invalidate
    end
    App->>Cache: Read data
    alt Miss
      App->>DB: Fetch data
      App->>Cache: Write data
    end

The cache is repopulated only when a later read forces an update; in effect, writing deliberately forces a cache miss.
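A minimal sketch of write-plus-invalidation, with dicts as stand-ins and illustrative function names; the read path is the same lazy-loading flow shown earlier:

cache = {"user:1": {"name": "Ada"}}
db = {"user:1": {"name": "Ada"}}

def save_user(key, value):
    db[key] = value                 # write to the datastore
    cache.pop(key, None)            # on success, drop the cached entry (forced miss)

def load_user(key):
    value = cache.get(key)
    if value is None:               # miss, e.g. right after a write
        value = db.get(key)
        cache[key] = value          # the read repopulates the cache
    return value

save_user("user:1", {"name": "Grace"})
load_user("user:1")                 # miss: returns the fresh value and refills the cache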

Cache Through

Ideal when the cache is heavily utilized, as in real-time and low-latency applications. The application and the cache own the data.

Read-Through Cache

  • Read From Cache: The application reads from the cache.
  • If the data is absent, the caching layer loads it from the data source.
  • The cache returns the data to the application and stores it for future requests (sketched in code after the diagram).
sequenceDiagram
    participant App as Application
    participant Cache
    participant DB as Data Source
    App->>Cache: Read data
    alt In Cache
        Cache->>App: Return data
    else Not in Cache
        Cache->>DB: Fetch data
        DB->>Cache: Return data
        Cache->>App: Return data
    end
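A minimal sketch of a read-through cache in Python: the application only talks to the cache object, and the cache itself fetches from the data source on a miss. ReadThroughCache and loader are illustrative names, not a real library API.

from typing import Any, Callable, Dict

class ReadThroughCache:
    def __init__(self, loader: Callable[[str], Any]):
        self._store: Dict[str, Any] = {}
        self._loader = loader            # how the cache reaches the data source

    def get(self, key: str) -> Any:
        if key in self._store:           # in cache: return directly
            return self._store[key]
        value = self._loader(key)        # not in cache: the cache fetches the data
        self._store[key] = value         # store it for future requests
        return value

db = {"user:1": {"name": "Ada"}}
cache = ReadThroughCache(loader=db.get)

cache.get("user:1")                      # miss: the cache loads from the datastore
cache.get("user:1")                      # hit: served by the cache itself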

Write-Through Cache

Consistency is immediate, but write latency is higher.

  • The application writes the data to the cache.
  • The cache writes the data through to the datastore.
  • The application then receives the write acknowledgement (sketched in code after the diagram).
sequenceDiagram
    participant App as Application
    participant Cache
    participant DB as Data Source
    App->>Cache: Write data
    Cache->>DB: Write data
    Cache->>App: Acknowledge write
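A minimal sketch of a write-through cache: the application writes to the cache, and the cache synchronously writes the data to the datastore before the call returns. WriteThroughCache and put are illustrative names.

from typing import Any, Dict

class WriteThroughCache:
    def __init__(self, datastore: Dict[str, Any]):
        self._store: Dict[str, Any] = {}
        self._datastore = datastore

    def put(self, key: str, value: Any) -> None:
        self._store[key] = value         # write to the cache
        self._datastore[key] = value     # synchronously write through to the datastore
        # only after both writes does the caller get its acknowledgement (the return)

db = {}
cache = WriteThroughCache(db)
cache.put("user:1", {"name": "Ada"})
db["user:1"]                             # the datastore is already consistent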

Write-Behind Cache

Consistency is eventual, but write latency is very low.

  • The application writes the data to the cache.
  • The application receives the write acknowledgement immediately.
  • The cache syncs the data to the datastore periodically or at a later time (sketched in code after the diagram).
sequenceDiagram
    participant App as Application
    participant Cache
    participant DB as Data Source
    App->>Cache: Write data
    Cache->>App: Acknowledge write
    Cache-->>DB: Sync data (later)
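A minimal sketch of a write-behind cache: writes are acknowledged as soon as they hit the cache, and dirty entries are flushed to the datastore later. WriteBehindCache, put, and flush are illustrative names; a real implementation would flush from a timer or background worker and handle failures.

from typing import Any, Dict, Set

class WriteBehindCache:
    def __init__(self, datastore: Dict[str, Any]):
        self._store: Dict[str, Any] = {}
        self._dirty: Set[str] = set()    # keys not yet synced to the datastore
        self._datastore = datastore

    def put(self, key: str, value: Any) -> None:
        self._store[key] = value         # write to the cache only
        self._dirty.add(key)             # mark for a later sync; the caller is acknowledged now

    def flush(self) -> None:
        # called periodically to sync dirty entries to the datastore
        for key in self._dirty:
            self._datastore[key] = self._store[key]
        self._dirty.clear()

db = {}
cache = WriteBehindCache(db)
cache.put("user:1", {"name": "Ada"})     # fast acknowledgement; db is still empty
cache.flush()                            # the later sync makes the datastore consistent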