Live Preview with Next.js
Collaborate in real time with live preview for Sanity and Next.js.
Published
Knut Melvær
Head of Developer Community and Education
Today, we're introducing live content authoring previews for Sanity-powered Next.js sites: the fastest, most lightweight preview experience, available now in our new toolkit for Next.js.
Launch one of our new starters on Vercel to give it a quick try, or enjoy the details below. The preview runs entirely client-side, so there's no need to set up special servers or other infrastructure: it just works, right in the browser.
Deploy an e-commerce Next.js starter with live preview →
Jamstack and static sites work great for people on the web. But as always, there are tradeoffs. Even with incremental builds and other optimizations, there can be noticeable latency between hitting publish and seeing the change show up on your site. Thanks to Next.js’ fallback mode, you can selectively render new content on the fly, but even a page refresh feels long when you are in a creative flow. This workflow also doesn’t play well in collaborative settings, or when a lot of connected data is changing.
Sanity is built for real-time collaboration from the ground up. The open-source Sanity Studio brings features like Presence, which lets you see where your team is working when you're in the same document, and Review Changes, which lets you inspect and selectively revert any change made to a document. The Split Pane feature gives you a way to extend the editorial experience with custom previews, with APIs for leveraging the real-time content in the Studio.
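If you want to try that Studio-side preview, the desk structure API lets you put a custom view pane next to the editing form. Here's a minimal sketch for Sanity Studio; PagePreview is a hypothetical React component of your own that renders props.document.displayed, the live draft version of the document, and re-renders as it changes.

// deskStructure.js – registered as the Studio's structure part in sanity.json.
// PagePreview is a hypothetical component; swap in your own preview renderer.
import S from '@sanity/desk-tool/structure-builder'
import PagePreview from './components/PagePreview'

// Show the regular editing form and a custom preview pane side by side
export const getDefaultDocumentNode = () =>
  S.document().views([
    S.view.form(),
    S.view.component(PagePreview).title('Preview'),
  ])

export default () => S.list().title('Content').items(S.documentTypeListItems())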
Thanks to our open-source technologies like groq-js and Mendoza, we are able to mimic part of our real-time datastore in the browser. We baked this into the new toolkit for Next.js so you can use the same GROQ expression both to fetch data from the backend and to subscribe to changes when you're authenticated (that is, logged into your Sanity project). Practically, this is exposed as a React Hook that you use in your page template to update the page data coming in from the Next.js data fetching methods.
import ErrorPage from 'next/error'
import {useRouter} from 'next/router'
import {groq} from 'next-sanity'
import {getClient, usePreviewSubscription} from '../../lib/sanity'
const postQuery = groq`
  *[_type == "post" && slug.current == $slug][0] {
    _id,
    title,
    excerpt,
    content,
    coverImage,
    "slug": slug.current
  }
`
export default function Post({data, preview}) {
  const router = useRouter()

  // Hooks must run before any early return. The hook only subscribes to
  // changes when `enabled` is true; otherwise it passes `initialData` through.
  const {data: post} = usePreviewSubscription(postQuery, {
    params: {slug: data?.post?.slug},
    initialData: data?.post,
    enabled: preview,
  })

  if (!router.isFallback && !post?.slug) {
    return <ErrorPage statusCode={404} />
  }

  if (router.isFallback) {
    return <div>Loading…</div>
  }

  return (
    <div>
      <h1>{post.title}</h1>
      <p>{post.excerpt}</p>
    </div>
  )
}
export async function getStaticProps({params, preview = false}) {
  const post = await getClient(preview).fetch(postQuery, {
    slug: params.slug,
  })

  return {
    props: {
      preview,
      data: {post},
    },
  }
}

export async function getStaticPaths() {
  const paths = await getClient().fetch(
    groq`*[_type == "post" && defined(slug.current)][].slug.current`
  )

  return {
    paths: paths.map((slug) => ({params: {slug}})),
    fallback: true,
  }
}
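The page above imports getClient and usePreviewSubscription from a shared helper module. Here's a minimal sketch of what that lib/sanity.js file might look like, assuming your project ID and dataset live in environment variables (the variable names are assumptions; adjust them to your own setup).

// lib/sanity.js – shared Sanity configuration for the app.
// The environment variable names are assumptions; use your own.
import {createClient, createPreviewSubscriptionHook} from 'next-sanity'

const config = {
  projectId: process.env.NEXT_PUBLIC_SANITY_PROJECT_ID,
  dataset: process.env.NEXT_PUBLIC_SANITY_DATASET,
  useCdn: process.env.NODE_ENV === 'production',
}

// React hook that subscribes to document changes in the browser
export const usePreviewSubscription = createPreviewSubscriptionHook(config)

// Client for published, CDN-cached content
const sanityClient = createClient(config)

// Client for drafts; needs a read token (kept server-side)
const previewClient = createClient({
  ...config,
  useCdn: false,
  token: process.env.SANITY_API_TOKEN,
})

// Pick the right client depending on whether preview mode is active
export const getClient = (usePreview) => (usePreview ? previewClient : sanityClient)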
Since this is all happening in the browser, there is no need for a dedicated preview server, and you can still use references and joins in your GROQ expressions. By following Next.js’ recommended approach of activating preview mode from a serverless function, you can make sure that this code only runs for authenticated users. No need for duplicated page templates: just a few lines of code and you're up and running.
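For reference, such a function can be as small as the sketch below, assuming a pages/api/preview.js route, a shared SANITY_PREVIEW_SECRET environment variable, and posts living under /posts/:slug (all of these names are placeholders you can change).

// pages/api/preview.js – serverless function that turns on Next.js preview mode.
export default function preview(req, res) {
  // Reject requests that don't know the shared secret or lack a slug
  if (req.query.secret !== process.env.SANITY_PREVIEW_SECRET || !req.query.slug) {
    return res.status(401).json({message: 'Invalid preview request'})
  }

  // Set the preview cookies so getStaticProps runs with preview = true
  res.setPreviewData({})

  // Redirect to the post being previewed
  res.writeHead(307, {Location: `/posts/${req.query.slug}`})
  res.end()
}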
The not-so-fine print
We're excited to bring you this toolkit in beta. It hasn't been fully optimized; there is a limit to how many documents in a dataset it can handle before things might start to slow down, and the preview comes with a preconfigured limit of 3,000 documents. There may also be other use cases that we haven't considered. Please share anything you discover as a reproducible case on GitHub, or let us know in our Slack community. Also feel free to tell us what's going well for your team and customers!