🧠 What Is Caching? (Simple Definition)

Caching means storing data temporarily in a fast place (RAM) so that we don’t repeatedly go to a slow place (database).

Think of it like:

  • Instead of opening the fridge (slow)

  • You keep your favourite drink on your table (fast)


⚡ Why Is Caching Needed?

Without cache:

  • Every request → goes to DB

  • DB becomes slow

  • App becomes slow

  • High load = DB crashes

With cache:

  • Only first request hits DB

  • Next thousands of requests hit cache

  • System becomes super fast


📘 Understanding Your First Diagram — Cache-Aside Pattern

This diagram shows one of the MOST COMMON caching strategies:
👉 Cache-Aside (Lazy Loading)

Let's break it down.

[Diagram: Cache-Aside Pattern]


🟢 1. First Request (Cache MISS)

(Cache MISS = Not found in cache)

🔹 Step-by-step:

Step 1: Web server asks cache → "Do you have this data?"

Cache replies → ❌ No

Step 2: Web server goes to the database

DB returns the actual data → ✔️

Step 3: Web server stores that data into cache

Cache now saves the data in memory → ⚡

Step 4: Web server returns the response to the user


🟡 2. Subsequent Requests (Cache HIT)

(Cache HIT = Found in cache)

Now the data is already stored in cache.

🔹 Step-by-step:

Step 1: Web server again asks cache

This time cache replies → ✔️ Yes, I have it

Step 2: Cache instantly returns data

Only 1–5 milliseconds → ⚡ SUPER FAST

No database is touched.
No extra load.
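
Here is a minimal sketch of this whole read path in Python, assuming a local Redis reached through the redis-py client; load_user_from_db() is a hypothetical stand-in for the real database query:

import json
import redis

r = redis.Redis(host="localhost", port=6379, decode_responses=True)

def load_user_from_db(user_id):
    return {"id": user_id, "name": "Yuva"}  # stand-in for a real DB query

def get_user(user_id):
    key = f"user:{user_id}"
    cached = r.get(key)                    # Step 1: ask the cache first
    if cached is not None:                 # Cache HIT → answer in a few milliseconds
        return json.loads(cached)
    user = load_user_from_db(user_id)      # Cache MISS → go to the database
    r.set(key, json.dumps(user), ex=60)    # Store it in cache with a 60-second TTL
    return user                            # Return the response to the user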


🟠 Why Is This Pattern Called "Lazy Loading"?

Because:

  • The system does NOT store data in the cache by default

  • It waits until someone asks for the data the first time

Only when the first request comes → the cache is filled.


👀 Understanding Your Second Diagram — Popular Profiles Cache

This is an example of using cache for hot data (frequently accessed data).

[Diagram: Popular Profiles Cache]

Example:

Think of Instagram:

  • Some users have millions of profile visits

  • Their profiles would overload the DB

So what Instagram does:

👉 It caches "popular profiles"
👉 All requests for these profiles go directly to cache


📘 Flow in This Diagram

1️⃣ User requests a profile

Web server receives the request

2️⃣ Web server sends request to cache

If the profile is popular → it is already in the cache

3️⃣ Cache returns the profile

Super fast → no DB hit

The DB is not touched for every request
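
One way to keep such hot profiles in cache is to warm them up ahead of time. A small sketch with redis-py; the list of popular IDs and the profile lookup are assumptions:

import json
import redis

r = redis.Redis(decode_responses=True)

POPULAR_USER_IDS = [1, 7, 42]  # hypothetical list of "hot" profiles

def warm_popular_profiles():
    # Pre-load popular profiles so their reads never touch the DB
    for user_id in POPULAR_USER_IDS:
        profile = {"id": user_id, "followers": 1_000_000}  # stand-in for a DB query
        r.set(f"profile:{user_id}", json.dumps(profile), ex=300)  # keep hot for 5 minutes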


🧰 Why Is Redis Used for Caching?

Redis is perfect for this because:

✔ It stores data in RAM (insanely fast)
✔ Distributed (works across multiple servers)
✔ Supports TTL (auto-expire data)
✔ Good for session storage
✔ Good for rate limiting
✔ Good for real-time counters
✔ One of the most popular caching tools globally
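
As a tiny illustration of the rate-limiting use case, here is a rough per-minute limiter with redis-py (the key name and limit are just example choices):

import redis

r = redis.Redis(decode_responses=True)

def allow_request(user_id, limit=100):
    key = f"rate:{user_id}"
    count = r.incr(key)        # atomic counter per user
    if count == 1:
        r.expire(key, 60)      # start a 60-second window on the first request
    return count <= limit      # False once the user exceeds the limit in this window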


πŸ—οΈ Full Cache Tutorial (Easy to Understand)


1️⃣ When do you use caching?

You use caching when:

  • Database queries are slow

  • Same data is requested frequently

  • External API calls are expensive

  • You have heavy traffic

Examples:

  • Profile info

  • Leaderboards

  • Product pages

  • Blockchain RPC data

  • Dashboard metrics


2️⃣ Where does Redis sit in your architecture?

Client → Backend → Redis Cache → Database

3️⃣ Popular Cache Strategies (explained like a story)


🟢 A. Cache-Aside (Lazy Loading) — Most used

Already explained using your diagram.

Use when:

  • Read-heavy apps

  • Data doesn’t change frequently

Example:

  • Profile info

  • Product details


🟡 B. Write-Through Cache

Write goes to:

  1. Cache

  2. Database

At the same time.

Useful when you want cache + DB always in sync.
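
A hedged sketch of write-through in application code: every write updates the database and the cache together (save_user_to_db() is a hypothetical helper):

import json
import redis

r = redis.Redis(decode_responses=True)

def save_user_to_db(user_id, user):
    pass  # stand-in for the real DB write

def update_user_write_through(user_id, user):
    save_user_to_db(user_id, user)                      # 1. write to the database
    r.set(f"user:{user_id}", json.dumps(user), ex=300)  # 2. write to the cache → always in sync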


🟠 C. Write Back (Write Behind)

Write goes only to the cache → the data is flushed to the DB later (usually by a background job).

Fastest writes
But data can be lost if the cache crashes before the flush.
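
A rough way to sketch write-back with redis-py: write only to the cache and queue the change in a Redis list that a background worker flushes to the DB later (the queue name and worker loop are assumptions):

import json
import redis

r = redis.Redis(decode_responses=True)

def save_user_to_db(user):
    pass  # stand-in for the real DB write

def update_user_write_back(user_id, user):
    r.set(f"user:{user_id}", json.dumps(user))                      # fast write, cache only
    r.lpush("db_write_queue", json.dumps({"id": user_id, **user}))  # queue the DB write

def flush_worker():
    # Background worker: drain the queue and persist writes later
    while True:
        _, payload = r.brpop("db_write_queue")                      # blocks until an item arrives
        save_user_to_db(json.loads(payload))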


4️⃣ Cache Expiry — TTL (Time To Live)

Every cached key can expire automatically.

Example:

SET user:123 '{"name": "Yuva"}' EX 60

Meaning:

  • Store profile for 60 seconds

  • After that → auto delete

Helps avoid stale data.
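
The same idea from application code, assuming redis-py (this mirrors the CLI command above):

import json
import redis

r = redis.Redis(decode_responses=True)

r.set("user:123", json.dumps({"name": "Yuva"}), ex=60)  # auto-deletes after 60 seconds
print(r.ttl("user:123"))                                # remaining lifetime, in seconds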


5️⃣ Eviction Policies (When Cache Memory Is Full)

When memory is full, Redis evicts keys based on the configured policy:

  • LRU → Least recently used keys go first (allkeys-lru / volatile-lru)

  • LFU → Least frequently used keys go first (allkeys-lfu / volatile-lfu)

  • TTL → Keys closest to expiry go first (volatile-ttl)

  • NOEVICTION → Return an error on writes when full
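
Eviction only happens once a memory limit is configured. A sketch of setting this at runtime with redis-py; the same values can live in redis.conf as maxmemory and maxmemory-policy (100mb is an example figure):

import redis

r = redis.Redis(decode_responses=True)

r.config_set("maxmemory", "100mb")               # cap the memory the cache may use
r.config_set("maxmemory-policy", "allkeys-lru")  # evict least recently used keys first
# Other built-in policies: allkeys-lfu, volatile-lru, volatile-lfu, volatile-ttl,
# allkeys-random, volatile-random, noeviction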


6️⃣ Cache Problems (Simple Explanation)


❌ Cache Miss

Data not found → Goes to DB.


❌ Cache Stampede

When cache expires → thousands of users hit DB at once.

Solution:

  • Staggered TTL

  • Cache locking

  • Background refresh
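
A small sketch combining two of these ideas with redis-py: a lock so only one caller rebuilds the value, plus a randomized (staggered) TTL; build_report() is a hypothetical expensive query:

import json
import random
import redis

r = redis.Redis(decode_responses=True)

def build_report():
    return {"rows": []}  # stand-in for the expensive DB query

def get_report(key="report:daily"):
    cached = r.get(key)
    if cached is not None:
        return json.loads(cached)

    # Cache locking: only the caller that wins the lock rebuilds the value
    if r.set(f"lock:{key}", "1", nx=True, ex=10):
        value = build_report()
        ttl = 60 + random.randint(0, 30)       # staggered TTL spreads out expiry times
        r.set(key, json.dumps(value), ex=ttl)
        r.delete(f"lock:{key}")
        return value

    return None  # losers could wait briefly and retry, or serve slightly stale data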


❌ Stale Data

Cache contains old information.

Solution:

  • Short TTL

  • Invalidate when updating
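
A minimal sketch of invalidate-on-update with redis-py (the DB helper is hypothetical):

import redis

r = redis.Redis(decode_responses=True)

def update_product_in_db(product_id, fields):
    pass  # stand-in for the real DB update

def update_product(product_id, fields):
    update_product_in_db(product_id, fields)
    r.delete(f"product:{product_id}")  # drop the stale copy; next read reloads fresh data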


🪄 Real-World Example (Sui Network Monitoring)

When you track:

  • Epoch

  • Checkpoints

  • Transaction count

  • Validator count

  • Gas price

Instead of calling RPC every second:

👉 Store values in Redis for 5–10 seconds
👉 All dashboards read from Redis
👉 Backend becomes lightning fast
👉 RPC node is safe from heavy load
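
A hedged sketch of that flow in Python with redis-py; fetch_sui_metrics() stands in for the real RPC calls to the Sui node:

import json
import redis

r = redis.Redis(decode_responses=True)

def fetch_sui_metrics():
    # Stand-in for real RPC calls (epoch, checkpoints, tx count, validators, gas price)
    return {"epoch": 0, "checkpoints": 0, "tx_count": 0, "validators": 0, "gas_price": 0}

def get_sui_metrics():
    cached = r.get("sui:metrics")
    if cached is not None:
        return json.loads(cached)                       # dashboards read from Redis

    metrics = fetch_sui_metrics()                       # hit the RPC node at most...
    r.set("sui:metrics", json.dumps(metrics), ex=10)    # ...once every 10 seconds
    return metrics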


🧩 Summary (Very Simple)

✔ Cache = fast memory

✔ Redis = one of the most popular caching tools

✔ Cache-Aside = most common approach

✔ First request → DB

✔ Next requests → Redis

✔ Fewer DB hits = faster system

✔ Great for dashboards, profiles, blockchain data