Track
Work-ready Next.js

Controlling cached content in Next.js


Set up Next.js so that as you make changes and navigate through the application, you can observe the impact of your cache configuration.


In recent years, the popularity of the "Jamstack" and Static Site Generators reduced the importance of caching when serving web applications. However, as the limitations of those approaches became more apparent, dynamic, server-rendered responses regained popularity, putting caching back in the spotlight.

Next.js 14 not only provided aggressive caching for an application's fetch requests, but it also made it the default.

This led to faster response times at the expense of increased developer frustration. Every fetch request was cached by default, in both development and production. Further, this cache is stored in a data layer separate from your site code, so redeploys did not reset the site's state as you may have expected in the Jamstack years.

Next.js 15 has reversed this decision, and caching is opt-in once again. This was likely a difficult decision because there are pitfalls either way.

In this writer's opinion, this decision is not strictly better; it is just different.

It's more important to understand what has been cached and when than whether a request was cached by default.

In short, you will want to specify the caching configuration and be able to observe its results.

Fortunately, a Next.js configuration setting logs the full URL of any fetch request, along with information about whether it was a cache HIT or MISS – and why.

A cache HIT occurs when the requested data is found in the cache, allowing it to be served quickly without fetching from the source.
A cache MISS is the opposite: the data is not in the cache and must be fetched from the source, which is slower than serving from the cache.

Sanity Client uses fetch under the hood, so once you have enabled this debugging mode below, every query you perform with it will appear in the console.

Update your next.config.mjs with the following configuration
next.config.mjs
/** @type {import('next').NextConfig} */
const nextConfig = {
  logging: {
    fetches: {
      fullUrl: true,
    },
  },
  // ...all other settings
};

export default nextConfig;

Now refresh any page that fetches data from Sanity – like the posts index or an individual post page – and you should see something like the following in your console:

GET /posts 200 in 39ms
│ GET https://q1a918nb.apicdn.sanity.io/v2024-07-24/data/query/production?query=*%5B_type+%3D%3D+%22post%22+%26%26+defined%28slug.current%29%5D%5B0...12%5D%7B%0A++_id%2C+title%2C+slug%0A%7D&returnQuery=false 200 in 5ms (cache hit)

From this, you can observe:

  • The client.fetch() request was for apicdn.sanity.io which means the request was performed with Sanity Client's useCdn set to true.
  • As a cache hit, the response was fulfilled by the Next.js cache, so this request for /posts never needed to reach Sanity's CDN.
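The long query string in that log line is just your GROQ query, URL-encoded. If you ever need to confirm which query produced a given log entry, you can decode it with the standard URL API. The snippet below uses a shortened, illustrative version of the URL above:

```typescript
// The logged Sanity URL carries the GROQ query in its `query` parameter,
// URL-encoded. Decoding it recovers the original query text.
// (Shortened example URL; the query here is illustrative.)
const loggedUrl =
  "https://q1a918nb.apicdn.sanity.io/v2024-07-24/data/query/production" +
  "?query=*%5B_type+%3D%3D+%22post%22%5D&returnQuery=false";

const query = new URL(loggedUrl).searchParams.get("query");
console.log(query); // *[_type == "post"]
```

This works because `URLSearchParams` decodes both percent-escapes and `+`-encoded spaces in one step.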

In the previous course – Content-driven web application foundations – these fetches were configured to update at most once every 60 seconds.

  • If a cache hit has already been served within 60 seconds, the response will be fast.
  • If that time has elapsed, the response will still be served from the now-stale cache – but in the background, the cache will be repopulated so that the next request receives fresh content.

This is similar to the stale-while-revalidate pattern of caching responses.
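The two behaviors above can be sketched as a tiny in-memory cache in plain TypeScript. This is only an illustration of the stale-while-revalidate idea – the names are made up, no Next.js APIs are involved, and Next.js's real data cache is considerably more involved (for one thing, it repopulates in the background rather than inline):

```typescript
type Entry<T> = { value: T; storedAt: number };

// Minimal stale-while-revalidate sketch: serve cached data while it is
// fresh, and when an entry has expired, answer with the stale value while
// repopulating the cache so the NEXT request receives fresh data.
class SwrCache<T> {
  private entries = new Map<string, Entry<T>>();

  constructor(
    private maxAgeMs: number,
    private fetcher: (key: string) => T,
    private now: () => number = Date.now, // injectable clock, for testing
  ) {}

  get(key: string): { value: T; status: "MISS" | "HIT" | "STALE" } {
    const entry = this.entries.get(key);
    if (!entry) {
      const value = this.fetcher(key); // MISS: go to the source
      this.entries.set(key, { value, storedAt: this.now() });
      return { value, status: "MISS" };
    }
    if (this.now() - entry.storedAt < this.maxAgeMs) {
      return { value: entry.value, status: "HIT" }; // still fresh
    }
    // Expired: repopulate (a real cache does this in the background),
    // but still answer this request with the stale value.
    this.entries.set(key, { value: this.fetcher(key), storedAt: this.now() });
    return { value: entry.value, status: "STALE" };
  }
}
```

With a 60-second max age, a request at 30 seconds is a HIT, a request at 90 seconds is served stale, and the request after that is a fresh HIT.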

Seeing what is cached is helpful, but it's even better to be able to completely reset the cache during development.

In the following lessons, you'll look at setting up surgical control for revalidating fetches based on time, path, and tag. This is the preferred option for your production web application. But sometimes, in development, you need a hammer.

Create a new API route in your application:
src/app/api/revalidate/all/route.ts
import { revalidatePath } from 'next/cache'

export async function GET() {
  if (process.env.NODE_ENV === 'development') {
    revalidatePath('/', 'layout')
    return Response.json({ message: 'Layout revalidated' })
  }

  return Response.json({
    message: 'This route is configured to only revalidate the layout in development',
  })
}
Visit http://localhost:3000/api/revalidate/all and you should see the "Layout revalidated" message in your browser.
Visit http://localhost:3000/posts to check that it has worked.

You should see a different log in the terminal that finishes with cache skip:

GET /posts 200 in 893ms
│ GET https://q1a918nb.apicdn.sanity.io/v2024-07-24/data/query/production?query=*%5B_type+%3D%3D+%22post%22+%26%26+defined%28slug.current%29%5D%5B0...12%5D%7B%0A++_id%2C+title%2C+slug%0A%7D&returnQuery=false 200 in 743ms (cache skip)
│ │ Cache skipped reason: (cache-control: no-cache (hard refresh))

Refresh the page again, and the request should once again be a cache hit.

Now you can purge the entire Next.js cache on demand, and observe the caching behavior of every fetch request made in the application.

The two uses of client.fetch in your application currently have the same configuration. This presents an opportunity to make your code more DRY (don't repeat yourself) and set some sensible defaults.
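To illustrate the idea, shared defaults can live in one place and be merged with per-call overrides. The sketch below is plain TypeScript with hypothetical names – it is not the course's final implementation, and the option names merely mirror the revalidate/tag concepts discussed above:

```typescript
// Hypothetical shape of the options both client.fetch calls repeat today.
type FetchOptions = { revalidate: number; tags: string[] };

// Sensible defaults, declared once.
const DEFAULTS: FetchOptions = { revalidate: 60, tags: [] };

// Merge per-call overrides over the shared defaults, so each call site
// only declares what differs from the norm.
function withDefaults(overrides: Partial<FetchOptions> = {}): FetchOptions {
  return { ...DEFAULTS, ...overrides };
}

// e.g. a query that additionally opts into tag-based revalidation:
const options = withDefaults({ tags: ["post"] });
console.log(options); // { revalidate: 60, tags: ["post"] }
```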

In the next lesson, let's do this and better understand how Sanity and Next.js caching work together.
