
🚀 NGINX CACHING SERVER - END-TO-END TUTORIAL

NGINX can act as a cache layer between the client and your backend server.
This gives you:

✔ Faster response times
✔ Lower backend load
✔ More traffic handled per server
✔ Fewer DB/API calls


📌 1. What is NGINX Caching? (Simple Explanation)

Flow:

Client → NGINX (Cache Layer) → Backend Server

When a request comes in:

  1. NGINX checks: is the response already cached?

  2. If YES → it returns the cached response instantly.

  3. If NO → it fetches from the backend, stores the response in the cache, and returns it.


๐Ÿ“ 2. Prerequisites


🧱 3. Create the Cache Directory

Give the directory to the NGINX worker user (www-data on Debian/Ubuntu) rather than opening it up with chmod 777:

sudo mkdir -p /var/cache/nginx_cache
sudo chown www-data:www-data /var/cache/nginx_cache

โš™๏ธ 4. Configure NGINX Cache

Edit your site config:

sudo nano /etc/nginx/sites-available/cache_server

Paste this:

proxy_cache_path /var/cache/nginx_cache levels=1:2 keys_zone=my_cache:10m
                 max_size=1g inactive=60m use_temp_path=off;

server {
    listen 80;
    server_name example.com;

    location / {
        proxy_cache my_cache;
        proxy_pass http://127.0.0.1:3000;

        proxy_cache_valid 200 302 10m;
        proxy_cache_valid 404 1m;

        proxy_cache_use_stale error timeout updating http_500 http_502 http_503 http_504;

        add_header X-Cache-Status $upstream_cache_status;
    }
}

What each line means:

  • proxy_cache_path → where cache files are stored on disk

  • keys_zone=my_cache:10m → shared memory for the cache index (1 MB holds roughly 8,000 keys)

  • max_size=1g → maximum size of the on-disk cache

  • inactive=60m → entries not accessed for 60 minutes are evicted

  • proxy_cache_valid → cache 200/302 responses for 10 minutes and 404s for 1 minute

  • proxy_cache_use_stale → serve a stale copy when the backend errors out or times out

  • X-Cache-Status → response header showing HIT / MISS / EXPIRED / BYPASS
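
On disk, NGINX names each cache file after the MD5 hash of its cache key (by default $scheme$proxy_host$request_uri), and levels=1:2 builds subdirectories from the tail of that hash. A small sketch of the mapping (the key string below is a hypothetical example for the backend above):

```shell
# NGINX names each cache file after the MD5 hash of its cache key
# (default key: $scheme$proxy_host$request_uri). With levels=1:2 the
# path is <last char>/<previous two chars>/<full hash>.
key="http127.0.0.1:3000/api/products/"   # hypothetical cache key
hash=$(printf '%s' "$key" | md5sum | awk '{print $1}')
l1=$(printf '%s' "$hash" | cut -c32)      # last character of the hash
l2=$(printf '%s' "$hash" | cut -c30-31)   # previous two characters
echo "/var/cache/nginx_cache/$l1/$l2/$hash"
```

This is why the cache directory fills with short nested folders rather than human-readable URLs.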


🔗 5. Enable the Site

sudo ln -s /etc/nginx/sites-available/cache_server /etc/nginx/sites-enabled/
sudo nginx -t
sudo systemctl reload nginx

🧪 6. Test Cache Behavior

Make a request:

curl -I http://your-domain

Check the X-Cache-Status header:

  • MISS → first request, cache was empty

  • HIT → subsequent requests, served from cache

Example:

X-Cache-Status: MISS

then:

X-Cache-Status: HIT

🎉 This confirms caching is working.


⚡ 7. Cache Control Per Path (Optional)

Cache /api/products/ for 5 minutes:

location /api/products/ {
    proxy_cache my_cache;
    proxy_pass http://127.0.0.1:3000;
    proxy_cache_valid 200 5m;
}

Disable cache for login routes:

location /login {
    proxy_no_cache 1;
    proxy_cache_bypass 1;
    proxy_pass http://127.0.0.1:3000;
}
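
The cache key itself can also be tuned per path. By default NGINX keys entries on $scheme$proxy_host$request_uri; a sketch of an explicit key that also separates entries by request method and query string (the /api/search/ path here is an illustrative assumption, not from the setup above):

```nginx
location /api/search/ {
    proxy_cache my_cache;
    # illustrative custom key: method + scheme + host + URI + query string
    proxy_cache_key "$request_method$scheme$host$uri$is_args$args";
    proxy_cache_valid 200 2m;
    proxy_pass http://127.0.0.1:3000;
}
```

Two requests that produce different keys get separate cache entries, so only add variables you actually need, or the hit rate drops.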

๐Ÿ” 8. Ignore Cookies / Headers (Optional)

Don't let cookies break caching:

proxy_ignore_headers Cache-Control Expires Set-Cookie;
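
Alternatively, instead of ignoring cookies you can skip the cache only when a session cookie is present, so logged-in users always reach the backend. A sketch using a map (the cookie name session_id is an assumption; the map block lives in the http context, outside server):

```nginx
# http context: flag requests that carry a session cookie (cookie name is hypothetical)
map $http_cookie $skip_cache {
    default          0;
    "~*session_id="  1;
}

# inside the server block:
location / {
    proxy_cache        my_cache;
    proxy_cache_bypass $skip_cache;  # skip the cache lookup for logged-in users
    proxy_no_cache     $skip_cache;  # and don't store their responses
    proxy_pass         http://127.0.0.1:3000;
}
```

This keeps anonymous traffic fast while personalized responses stay uncached.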

โญ 9. Microcaching (High-Performance API Cache)

Microcaching caches responses for 1โ€“2 seconds โ€” reduces backend load massively.

proxy_cache_valid 200 1s;
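
A fuller microcaching sketch usually pairs the short TTL with proxy_cache_lock, so that when an entry expires only one request goes to the backend while the others are served the stale copy (the /api/ path is illustrative):

```nginx
location /api/ {
    proxy_cache my_cache;
    proxy_cache_valid 200 1s;        # microcache: fresh for one second
    proxy_cache_lock on;             # collapse concurrent misses into one upstream fetch
    proxy_cache_use_stale updating;  # serve the old copy while one request refreshes it
    proxy_pass http://127.0.0.1:3000;
}
```

Under a burst of 1,000 requests per second, the backend then sees roughly one request per second per URL.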

This is used by:

✔ High-traffic APIs
✔ E-commerce sites
✔ Crypto dashboards
✔ News feeds
✔ Leaderboards


🔥 10. Purge Cache (Manually)

Clear the entire cache:

sudo rm -rf /var/cache/nginx_cache/*
sudo systemctl reload nginx

(NGINX treats a missing cache file as a MISS, so deleted entries are simply refetched from the backend.)
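
You can also purge a single URL without any extra module: each cache file starts with a "KEY:" line holding its cache key, so you can grep for it and delete only the matching files. A sketch demonstrated on a throwaway directory standing in for /var/cache/nginx_cache (run the grep against the real path with sudo):

```shell
# Each NGINX cache file begins with a "KEY: <cache key>" header line.
cache_dir=$(mktemp -d)   # stand-in for /var/cache/nginx_cache
printf 'KEY: http127.0.0.1:3000/api/products/\nbody...\n' > "$cache_dir/entry1"
printf 'KEY: http127.0.0.1:3000/login\nbody...\n'         > "$cache_dir/entry2"

# Delete only the entries whose stored key matches the URL to purge.
grep -lr 'KEY: http127.0.0.1:3000/api/products/' "$cache_dir" | xargs -r rm

ls "$cache_dir"   # only entry2 remains
```

The commercial proxy_cache_purge directive does this natively; the grep approach is the common workaround on open-source NGINX.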

📊 11. Monitoring Cache

View cache folder:

ls /var/cache/nginx_cache

Monitor logs:

tail -f /var/log/nginx/access.log
tail -f /var/log/nginx/error.log
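
To measure how effective the cache is, log $upstream_cache_status (e.g. by adding it to your log_format) and compute a hit ratio. The awk below is fed four simulated status values so it runs standalone; in practice you would pipe in the real log field:

```shell
# hit ratio from a stream of cache-status values (simulated input here)
printf 'HIT\nMISS\nHIT\nHIT\n' \
  | awk '{n++; if ($1=="HIT") h++} END {printf "%.0f%% hit ratio\n", 100*h/n}'
# prints: 75% hit ratio
```

A consistently low ratio usually means the cache key is too specific or the TTLs are too short.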

🛡 12. Combine with Reverse Proxy + Load Balancer

NGINX can do all three together:

Client → NGINX (Cache + SSL) → Load Balancer → Backend API cluster

Let me know if you want this architecture too.


🎉 Caching Server Setup Completed!

You now have:

✔ NGINX caching layer
✔ File-system cache
✔ Per-path caching rules
✔ Microcaching
✔ Cache purge
✔ Cache monitoring
✔ Failover with stale cache


โ“ What do you want next?

I can provide:

🔹 CDN-style caching (advanced)
🔹 Cache + Load Balancer + Reverse Proxy combo
🔹 High-availability caching architecture
🔹 Cache invalidation rules for APIs
🔹 Caching with Docker

Tell me what you want!