Track
Work-ready Next.js

Controlling cached content in Next.js

Lesson
1

Caching Fundamentals

Next.js has prioritized performance with its caching methods and expects you to configure them. Learn how to integrate the Next.js cache and Sanity CDN for high performance.
Simeon Griggs, Principal Educator at Sanity

You might not need this course.

The Live Content API, and its simplified implementation with next-sanity, handles all aspects of fetching, rendering, caching and invalidating queries in a few lines of code.

This course will continue to exist to explain the finer details of working with Sanity and the Next.js cache – but our strong recommendation is to use live fetches by default.
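As a rough sketch of how little code a live fetch setup involves, `next-sanity` exposes a `defineLive` helper. The file path and the `client` import here are assumptions, and the exact options may differ between versions, so treat this as illustrative and check the next-sanity documentation:

```typescript
// sanity/live.ts – sketch only; verify against the current next-sanity docs.
// `client` is assumed to be an already-configured Sanity client.
import { defineLive } from "next-sanity";
import { client } from "./client";

// `sanityFetch` replaces manual fetch/cache/revalidate wiring,
// and <SanityLive /> keeps rendered content up to date.
export const { sanityFetch, SanityLive } = defineLive({ client });
```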

Caching is not unique to Next.js or Vercel; it's a common strategy across all programming and comes in many forms. For example, in-memory caching is one approach that stores data in the application's memory for quick access.
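To make the idea concrete, here is a minimal in-memory cache with a time-to-live (TTL), written as an illustrative sketch rather than production code. It shows the core trade-off this lesson discusses: a cached value is served quickly until it expires, at which point it must be recomputed.

```typescript
// A minimal in-memory cache: values live in a Map and expire after `ttlMs`.
type Entry<T> = { value: T; expiresAt: number };

class TtlCache<T> {
  private store = new Map<string, Entry<T>>();

  constructor(private ttlMs: number) {}

  get(key: string): T | undefined {
    const entry = this.store.get(key);
    if (!entry) return undefined;
    if (Date.now() > entry.expiresAt) {
      // Stale: evict so the caller recomputes a fresh value.
      this.store.delete(key);
      return undefined;
    }
    return entry.value;
  }

  set(key: string, value: T): void {
    this.store.set(key, { value, expiresAt: Date.now() + this.ttlMs });
  }
}
```

The `ttlMs` value is exactly the "how long should we cache?" question discussed below: too long and readers see stale data, too short and the cache rarely helps.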

In web applications, caching typically refers to network requests. When a user requests a page from a web server, the response may be cached in their browser, so subsequent requests for the same page do not need to perform yet another round-trip for the same response.

Similarly, when a web server computes and returns a response, it may be cached on the server so that subsequent requests from many other clients can be fulfilled from its cache – faster than recomputing the same request.

This is where things get tricky. How long should your web server cache that response? If it's too long, your users may be frustrated by being served stale content. If it's too short, too many users may have to wait for the web server to compute responses – and your web server may use too many resources doing so.

In typical web applications, caching is controlled by the Cache-Control headers sent with responses (and, less commonly, requests).

See the MDN documentation on Cache-Control headers
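Before moving on to the framework-specific options, here is what header-based caching looks like in a Next.js Route Handler. The route path and response body are hypothetical; the `Cache-Control` directives tell shared caches (like a CDN) to serve the response for 60 seconds, then serve it stale for up to 5 minutes while revalidating in the background:

```typescript
// Hypothetical Route Handler, e.g. app/api/greeting/route.ts
export async function GET() {
  return new Response(JSON.stringify({ greeting: "hello" }), {
    headers: {
      "Content-Type": "application/json",
      "Cache-Control": "public, s-maxage=60, stale-while-revalidate=300",
    },
  });
}
```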

However, Next.js has framework-specific configuration options to scope and simplify setup. This course will primarily focus on these. The following resources may be valuable additional reading:

Next.js data fetching documentation
Next.js caching documentation

Once you have completed this course, you will:

  • Understand why caching matters based on who is most impacted and how.
  • Integrate requests for Sanity's CDN and API with the built-in Next.js cache, configured with sensible defaults.
  • Observe the impact of – and debug changes to – cache configuration.
  • Revalidate cached requests based on time, path, and tag.
  • Set up GROQ-powered webhooks to perform cache revalidation automatically when documents change.
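The revalidation techniques listed above can be previewed in a few lines. The URL and tag name here are hypothetical; `revalidateTag` is imported from `next/cache`, and the `next` option is the Next.js extension to the standard `fetch` API:

```typescript
import { revalidateTag } from "next/cache";

// Time-based: cache this fetch result, revalidating at most every 60 seconds.
// Tag-based: associate the cached entry with the "post" tag.
const res = await fetch("https://example.com/api/posts", {
  next: { revalidate: 60, tags: ["post"] },
});

// Later – for example in a webhook Route Handler – purge every cached
// entry tagged "post" so the next request fetches fresh content.
revalidateTag("post");
```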

There's no one-size-fits-all strategy for caching, so a developer team is responsible for fine-tuning their application's caches. Let's consider how different user groups are impacted by the types of caching that can be implemented.

In content-driven web applications, content authors typically want to see the effect of their changes happen immediately. The most reliable way to do this would be to remove all caching from the front end so that every response is freshly created. You could also retrieve content from Sanity's API instead of the CDN to ensure the freshest content is used.

However, this strategy also creates the slowest loading and most expensive operating web applications. Not ideal.

Content authors that would prefer to see fresh content before – or immediately after – publishing are better served by configuring Visual Editing – rather than modifying cache settings in production. Take the Integrated Visual Editing with Next.js course to find out how.

Stakeholders in your business would like to keep the running costs of your web application low and conversions high, so you might think the most aggressive caching strategy would suit them. The more requests served by a CDN instead of the API, the better. The less compute and bandwidth a web server spends fulfilling requests, the better.

See Cloudflare's documentation on how website speed affects conversion rates

However, overly aggressive caching is bound to frustrate your content authors and end-user groups.

The stakeholders mentioned above would also like to see improved conversions from end-users – who expect a mix of fast-loading pages and up-to-date, reliable content. For example, it's no good if a product page loads quickly but the stock level or price information is invalid.

One solution is to split requests for the long-lived and dynamic parts of the same page; Partial Prerendering in Next.js takes exactly this approach.
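As a sketch of Partial Prerendering, a page might mark its dynamic section with a Suspense boundary. The component names here are hypothetical, and the `experimental_ppr` flag assumes the feature is enabled in `next.config` (it is still experimental, so check the Next.js documentation for the current setup):

```tsx
// Hypothetical app/products/[id]/page.tsx
import { Suspense } from "react";

export const experimental_ppr = true;

export default function ProductPage() {
  return (
    <>
      {/* Long-lived content: prerendered and served from cache */}
      <ProductDescription />
      {/* Dynamic content: streamed in per request */}
      <Suspense fallback={<p>Loading stock…</p>}>
        <LiveStockLevel />
      </Suspense>
    </>
  );
}
```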

As you can see, each group affected by your web application's cache brings a unique point of view. This makes knowing how caching works – and reacting to the changing realities of how your application is used – so important.

Now that you understand the problem space and who is impacted, it's best to equip yourself with the tools required to configure and debug your web application's caching configuration.
