Understanding Cache Control and Policies in Nginx
When you serve content on the web, caching can make or break your performance. A properly configured cache can reduce load on your server, speed up response times, and improve the overall user experience. Nginx is widely used not just as a web server but also as a reverse proxy that handles caching efficiently.
This article walks you through how Nginx caching works, the different cache control mechanisms, and how you can tune your cache policies for different types of content.
What is caching in Nginx?
In simple terms, caching is the process of storing responses temporarily so that future requests for the same resource can be served faster. Instead of fetching data from your origin server every time, Nginx can store a copy of the response and serve it directly from memory or disk.
There are two main types of caching in Nginx:
- Content caching (proxy or FastCGI cache): Nginx stores full HTTP responses from the upstream or application server. When the cache duration is very short, often a second or less, this is called microcaching.
- Browser caching (via headers): Nginx sets response headers that control how browsers cache static assets like images, CSS, and JavaScript files.
Enabling caching
Let’s start with the basics. To enable caching, you need to define a cache zone in your Nginx configuration and then use it in a location block.
proxy_cache_path /var/cache/nginx levels=1:2 keys_zone=my_cache:10m max_size=1g inactive=60m use_temp_path=off;
server {
    location / {
        proxy_pass http://backend;
        proxy_cache my_cache;
        proxy_cache_valid 200 1h;
        proxy_cache_use_stale error timeout updating;
    }
}
Here’s what happens:
- proxy_cache_path defines where cached files are stored and how much disk space they may use (max_size=1g); inactive=60m removes entries that have not been accessed for an hour.
- keys_zone=my_cache:10m creates a shared memory zone of 10 MB to store cache keys and metadata.
- proxy_cache_valid 200 1h means responses with status 200 are cached for one hour.
- proxy_cache_use_stale allows Nginx to serve old (stale) cached data when the upstream server errors out, times out, or while the entry is being refreshed.
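Very short cache durations can also shield a dynamic backend from traffic bursts. A minimal microcaching sketch, assuming the same my_cache zone and a hypothetical backend upstream:

```nginx
# Cache every 200 response for just one second ("microcaching").
# Even a 1s TTL collapses a burst of identical requests into a
# single upstream fetch.
location /news/ {
    proxy_pass http://backend;        # hypothetical upstream
    proxy_cache my_cache;
    proxy_cache_valid 200 1s;
    proxy_cache_use_stale updating;   # serve stale while refreshing
    proxy_cache_lock on;              # only one request refreshes the entry
}
```

proxy_cache_lock ensures that when an entry expires, only one request goes to the upstream while the rest wait for (or receive) the cached copy.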
Controlling cache behavior with headers
Headers play a key role in deciding whether content is cached and for how long.
Cache-Control
This header tells clients and proxies how to handle caching. For example:
Cache-Control: public, max-age=3600
- public means the response can be cached by any cache (browser or proxy).
- max-age=3600 tells clients and proxies to consider the response fresh for one hour (3600 seconds).
For sensitive data, you can use:
Cache-Control: private, no-store
Here private restricts caching to the user's own browser, and no-store forbids any cache from storing the response at all.
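You can attach such a header from the Nginx side as well. A small sketch for a sensitive path (the /account/ location and backend upstream are illustrative):

```nginx
# Keep account pages out of shared caches and browser storage.
location /account/ {
    proxy_pass http://backend;   # hypothetical upstream
    add_header Cache-Control "private, no-store" always;
}
```

The always flag makes Nginx add the header on error responses too, not only on success codes.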
Expires
This is an older header but still supported. It specifies an explicit expiration date.
Expires: Thu, 31 Oct 2025 07:28:00 GMT
If both Cache-Control and Expires exist, Cache-Control takes precedence.
ETag and Last-Modified
These headers enable conditional requests: the browser sends the stored validator back (If-None-Match for ETag, If-Modified-Since for Last-Modified).
If the content hasn't changed, the server responds with 304 Not Modified instead of sending the entire response again.
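For static files, Nginx handles these validators itself. A minimal sketch of the related directives (both shown at their default values):

```nginx
location /static/ {
    root /var/www/html;
    etag on;                  # emit an ETag for static files (default: on)
    if_modified_since exact;  # compare If-Modified-Since by exact match
}
```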
Caching static assets
Static files are perfect candidates for long-term caching. You can set cache headers directly in your Nginx configuration.
location /static/ {
    root /var/www/html;
    expires 30d;
    add_header Cache-Control "public";
}
This tells browsers to cache files under /static/ for 30 days. When you update static assets, a good practice is to version them in filenames (like app.v2.js) so users automatically get the new file.
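Once filenames are versioned, the assets themselves never change, so they can be cached far more aggressively. A sketch matching names like app.v2.js (the regex and durations are illustrative):

```nginx
# Versioned assets never change in place, so let clients
# keep them for a year and skip revalidation entirely.
location ~* \.v\d+\.(?:js|css)$ {
    root /var/www/html;
    expires 1y;
    add_header Cache-Control "public, immutable";
}
```

The immutable directive tells browsers not to revalidate the file even when the user reloads the page.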
Bypassing cache
Sometimes you may want to skip caching for certain requests. You can do that based on conditions.
location /api/ {
    proxy_pass http://backend;
    proxy_no_cache $http_authorization;
    proxy_cache_bypass $http_authorization;
}
Here, if an Authorization header is present, proxy_cache_bypass tells Nginx not to answer from the cache, and proxy_no_cache prevents the response from being stored in it.
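The same pair of directives works with other variables too. For instance, a sketch that skips the cache when a nocache cookie or query argument is present (both names are arbitrary):

```nginx
location / {
    proxy_pass http://backend;   # hypothetical upstream
    proxy_cache my_cache;
    # A request like /?nocache=1, or a "nocache" cookie,
    # bypasses the cache and is not stored.
    proxy_cache_bypass $cookie_nocache $arg_nocache;
    proxy_no_cache $cookie_nocache $arg_nocache;
}
```

This is handy for spot-checking fresh responses from the origin without flushing the whole cache.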
Debugging cache behavior
When tuning cache policies, it helps to know whether responses are coming from the cache or the origin server. You can add a custom header to expose that:
add_header X-Cache-Status $upstream_cache_status;
After enabling this, you’ll see one of these values in the response:
- MISS – response was not found in the cache
- BYPASS – cache was bypassed due to a rule
- HIT – response was served from cache
- EXPIRED – the cached entry had expired, so a fresh response was fetched from the upstream
- UPDATING – a stale response was served while a new one is being fetched (requires proxy_cache_use_stale updating)
- STALE – a stale response was served because the upstream could not be reached
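Beyond a response header, the same variable can be written to the access log, which makes hit ratios easy to measure over time. A sketch (log format name and file path are illustrative; this goes in the http block):

```nginx
# Record the cache status alongside each request.
log_format cache_log '$remote_addr [$time_local] '
                     '$upstream_cache_status "$request" $status';
access_log /var/log/nginx/cache.log cache_log;
```

Counting HIT versus MISS lines in that log gives you a rough cache hit ratio.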
Best practices
- Separate cache zones for different applications or traffic types.
- Avoid caching sensitive endpoints like user profiles or payment data.
- Use versioned file names for static assets.
- Monitor cache size and hit ratios to balance speed and resource usage.
- Combine browser caching and proxy caching to get the best performance.
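The first point above, separate cache zones per traffic type, can be sketched as two independent proxy_cache_path declarations (paths, sizes, and zone names are illustrative):

```nginx
# One zone for short-lived API responses, another for static content,
# so heavy traffic on one cannot evict the other's entries.
proxy_cache_path /var/cache/nginx/api    keys_zone=api_cache:10m
                 max_size=500m inactive=10m;
proxy_cache_path /var/cache/nginx/static keys_zone=static_cache:50m
                 max_size=5g inactive=7d;
```

Each location block then opts into the appropriate zone with proxy_cache api_cache or proxy_cache static_cache.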
Closing thoughts
Caching in Nginx is not just about speed. It’s about striking the right balance between freshness and performance. A well-tuned cache policy reduces load on your backend, lowers response times, and gives users a snappier experience.
Whether you’re caching API responses for a few seconds or static assets for a month, Nginx gives you full control. Start small, experiment, and observe how caching impacts your system before rolling it out widely.