Next.js · April 17, 2026 · 10 min read

Next.js Caching Layers and Why Most Teams Get Them Wrong

Next.js ships with four distinct caching layers that interact in non-obvious ways. Understanding each layer — and the common misconfigurations that defeat them — is the difference between a fast app and one that confuses developers and wastes infrastructure spend.

Next.js 14 and 15 ship with four caching mechanisms that operate at different points in the request lifecycle. Each serves a distinct purpose. Each can be misconfigured in ways that either eliminate performance gains or cause stale data bugs that are difficult to reproduce. This article documents how each layer works, how they interact, and the failure modes that appear most frequently in production.

The Four Caching Layers

| Layer | Where it lives | What it caches | Default behaviour |
| --- | --- | --- | --- |
| Request Memoization | In-memory, per-request | fetch() call results | Deduplicates identical fetches within one render |
| Data Cache | Server-side, persistent | fetch() responses | Persists across requests until revalidated |
| Full Route Cache | Server-side, on-disk | Rendered HTML and RSC payloads | Cached at build for static routes |
| Router Cache | Client-side, in-memory | Prefetched RSC payloads | Lives for the browser session |

They stack. A request first hits the Router Cache (client), then the Full Route Cache (server), then the Data Cache (server), then Request Memoization (per-render), and finally the upstream data source. Invalidating one layer without accounting for the others is the root cause of most "why is this still stale?" bugs.

Request Memoization

Request Memoization is the narrowest cache. It deduplicates identical fetch() calls made during a single server render — same URL, same options. The purpose is to allow multiple components in a tree to fetch the same resource without making multiple HTTP calls.

// Both components call this. Only one HTTP request fires.
async function getUser(id: string) {
  const res = await fetch(`https://api.example.com/users/${id}`);
  return res.json();
}

Common mistake: Assuming memoization persists between requests. It does not. The cache is discarded at the end of each render. Teams sometimes expect repeated page loads to be served from memoization and are confused when upstream call counts remain high. That is the Data Cache's job.

Common mistake: Passing different headers or cache options to the same URL in different components. Two fetch() calls to the same URL with different option objects are treated as separate cache keys and both fire.
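To make the cache-key behaviour concrete, here is an illustrative sketch (not the actual Next.js internals) of why the options object participates in the key:

```typescript
// Sketch only: Next.js derives the memoization key from the URL plus
// the fetch options, so differing options defeat deduplication.
type FetchOpts = { headers?: Record<string, string>; cache?: string };

function memoKey(url: string, options: FetchOpts = {}): string {
  return url + "::" + JSON.stringify(options);
}

const plain = memoKey("https://api.example.com/users/1");
const traced = memoKey("https://api.example.com/users/1", {
  headers: { "x-trace-id": "abc" },
});

// plain and traced differ, so the second call is not deduplicated
// and fires its own HTTP request.
```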

Data Cache

The Data Cache persists fetch() responses across requests on the server. It survives process restarts when deployed on infrastructure with a shared file system or an external cache backend (Redis, Memcached). The default differs by version: in Next.js 14, fetch() calls in Server Components are cached indefinitely unless told otherwise; Next.js 15 reversed this, so fetches are uncached unless you opt in with cache: "force-cache" or a revalidate option.

Opting Out

// Never cached — always fetches fresh
const fresh = await fetch(url, { cache: "no-store" });

// Cached, revalidated every 60 seconds (stale-while-revalidate)
const cached = await fetch(url, { next: { revalidate: 60 } });

On-Demand Revalidation

// app/api/revalidate/route.ts
import { revalidateTag } from "next/cache";

export async function POST(request: Request) {
  const { tag } = await request.json();
  revalidateTag(tag);
  return Response.json({ revalidated: true });
}

Tag your fetches and call this endpoint from your CMS webhook or backend event.

Common mistake: Expecting revalidatePath() to behave synchronously. It schedules revalidation — the next request after the revalidation fires will get fresh data, not the request that triggered the call.

Common mistake on Kubernetes: Running multiple Next.js replicas with no shared cache backend. Each pod maintains its own Data Cache. A revalidation call hits one pod. The other pods continue serving stale data until they independently revalidate. The fix is an external cache adapter (Next.js 15 supports custom CacheHandler implementations) or sticky sessions, which are an antipattern for different reasons. Private DevOps wires up a Redis-backed cache handler as part of every Next.js Kubernetes deployment for this reason.
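A sketch of the wiring, following the shape of the Next.js custom cache handler configuration. The handler file itself is hypothetical here; it would wrap a Redis client and implement the get, set, and revalidateTag methods:

```javascript
// next.config.js — point the Data Cache at a custom backend
module.exports = {
  // Hypothetical Redis-backed handler module in the project root
  cacheHandler: require.resolve("./cache-handler.js"),
  // Disable the default in-memory cache so all replicas share state
  cacheMaxMemorySize: 0,
};
```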

Full Route Cache

The Full Route Cache stores the rendered HTML and React Server Component payload for statically generated routes. It is written at build time (next build) and served directly without executing any server-side code on subsequent requests.

Routes are static by default if they do not use dynamic APIs (cookies(), headers(), the searchParams page prop, draftMode()). Using any of these makes the route dynamic — it renders on every request and bypasses the Full Route Cache entirely. The client-side useSearchParams() hook is a separate case: it does not make the route dynamic, but during static rendering the component reading it must be wrapped in a Suspense boundary.

// This makes the entire route dynamic
import { cookies } from "next/headers";

export default async function Page() {
  const cookieStore = await cookies(); // async in Next.js 15 (synchronous in 14)
  // ...
}

Common mistake: Placing a cookie or header read in a deeply nested Server Component and being surprised that the parent route is no longer static. Dynamism propagates upward.

Common mistake: Using generateStaticParams() for a route and expecting on-demand revalidation to update the Full Route Cache. revalidatePath() and revalidateTag() invalidate the Full Route Cache for a given path, but the next render must complete before the new cached version is stored. Under high traffic this can cause a cache stampede — multiple concurrent requests all trigger a re-render because none of them see a valid cache entry yet.

The mitigation is the revalidate option at the route segment level:

// app/posts/[slug]/page.tsx
export const revalidate = 3600; // revalidate at most once per hour

Router Cache

The Router Cache lives in the browser. When a user navigates between pages in a Next.js app, the RSC payload for visited and prefetched routes is stored client-side and reused for subsequent navigations without a server round-trip.

The default cache duration varies by route type. The figures below are the Next.js 14 defaults; Next.js 15 lowered the dynamic default to 0 seconds, and both values are configurable through the experimental staleTimes option.

| Route type | Default cache duration |
| --- | --- |
| Static routes | 5 minutes |
| Dynamic routes | 30 seconds |

Common mistake: Expecting a form submission or server action to produce immediately visible updates across all routes. If the user navigates to a previously cached route, they may see stale data for up to five minutes. The fix is to call router.refresh() after a mutation, or to use revalidatePath() from a Server Action which automatically invalidates the relevant Router Cache entries on navigation.

"use server";

import { revalidatePath } from "next/cache";

export async function updatePost(id: string, data: FormData) {
  await db.post.update({ where: { id }, data: parseForm(data) });
  revalidatePath(`/posts/${id}`);
}

Common mistake: Assuming router.refresh() busts all cache layers. It refetches the RSC payload for the current route, which invalidates the Router Cache entry for that path. It does not invalidate the Data Cache for other routes or other users' Router Caches.

How the Layers Interact in Practice

A complete cache flush for a single page requires coordinating all four layers:

  1. Invalidate the Data Cache: revalidateTag() or revalidatePath() via a Server Action or API route
  2. The Full Route Cache entry is invalidated by the same revalidatePath() call
  3. The next request to that path re-renders, populating fresh Data Cache and Full Route Cache entries
  4. Client-side Router Cache entries expire on their own schedule, or can be busted by router.refresh()

Teams that only think about one layer — usually Data Cache — end up debugging stale pages that are fresh on one device and stale on another, or fresh in one browser tab and stale in another.

Diagnosing Cache Problems in Production

The most useful signal is the x-nextjs-cache response header:

| Value | Meaning |
| --- | --- |
| HIT | Served from Full Route Cache |
| MISS | Not in cache, rendered fresh |
| STALE | Served from cache while revalidation happens in background |
| SKIP | Cache bypassed (dynamic route or no-store) |

Log this header in your observability stack. A route that should be HIT returning MISS on every request is either dynamic when it should not be, or has a revalidation time so short it never actually caches.

A route returning HIT when it should be MISS means a revalidation is not propagating — check whether you are running multiple replicas without a shared Data Cache backend.
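A quick way to spot-check a route from the outside is to request it and read the header. A small helper sketch (the URL in the comment is a placeholder; note that some proxies and CDNs strip or rename this header):

```typescript
// Probe a deployed route and report its Full Route Cache status.
export async function cacheStatus(url: string): Promise<string> {
  const res = await fetch(url);
  return res.headers.get("x-nextjs-cache") ?? "header absent (proxy may strip it)";
}

// Usage: cacheStatus("https://example.com/posts/hello") resolves to
// one of the values in the table above, or the fallback string.
```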

Understanding the four caching layers as a system, rather than as independent features, is the prerequisite for building a Next.js application that performs consistently in production. The interactions are intentional but not always obvious from the documentation. Getting them right once means the application stays fast without ongoing debugging.

Need help with this?

Our team handles this kind of work daily. Let us take care of your infrastructure.