Live Content API
The Live Content API is perfect for fast-moving events like sports, news, and commerce. Deliver real-time experiences easily and at scale.
The Live Content API allows you to deliver live, dynamic experiences to your users without the complexity and scalability challenges that typically come with building real-time functionality.
With the Live Content API, you can:
- Subscribe to changes and receive notifications whenever documents are created, updated, or deleted.
- Efficiently query for the exact content you need, and only receive updates for that content.
- Scale to handle high volumes of live updates, even during peak traffic periods.
The Live Content API is designed to be easy to use and integrate into your existing application. It provides a simple, intuitive interface for subscribing to content changes and receiving real-time updates.
The API is in early access and we expect it to change and be simplified as we get feedback from the developer community.
This release is therefore mainly for developers who want to experiment, build support for this API into infrastructure or frameworks, or explore novel use cases.
The Live Content API will be available on all Sanity plans, tiered by the number of concurrent connections. Official pricing will be available soon, but the intention is to set the limits so that normal self-serve usage on the Free and Growth plans doesn't have to worry about running out of connections, while large-scale operations with thousands of concurrent users should expect to need an Enterprise plan to operate live.
Join the conversation in the #live-by-default channel of the Sanity community Slack.
Simen Svale and Cody Olsen talk through how this API works, demo an implementation in Next.js, and share some thoughts on what developing live content experiences should look like once this matures.
See it in action
The Live Content API running in a statically generated Next.js blog template powered by Sanity.
The API is available to everyone, but until official launch, you'll have to use the experimental API version in order to access it.
First, make sure you have the latest version of the client. If your client doesn't have support for the live channel built in, it is too old.
npm install @sanity/client@latest
When setting up your @sanity/client, specify the apiVersion as 'vX' to get access to the Live Content API features:
const client = createClient({
  projectId: 'your-project-id',
  dataset: 'your-dataset',
  apiVersion: 'vX',
  useCdn: true,
})
By setting apiVersion to 'vX', you're opting into the experimental version of the API that includes the live content capabilities. The opt-in is only effective for that specific request; it changes nothing in your settings and has no lasting impact on your project.
Here's a high-level overview of how the Live Content API works:
- Every query you make to the Content Lake is now returned along with something called sync tags. If you want to keep that content up to date in real time, you need to hold on to those tags.
- You can then subscribe to a stream of live updates by calling the client.live.events() method. This returns an Observable that emits events whenever the content in the dataset changes.
- Whenever you receive an event, check whether any of the event tags match the sync tags from content you want to keep up to date.
- If there is a match, refetch the content using the event ID as the lastLiveEventId argument in your client.fetch call. This ensures that you always get the latest version of the content from the CDN, avoiding any stale data.
Here is a minimal example running in the console, keeping a single document in sync using sync tags just to show how this works in principle:
import { createClient } from '@sanity/client'
const client = createClient({
  projectId: 'your-project-id',
  dataset: 'your-dataset',
  apiVersion: 'vX',
  useCdn: true,
})
const query = '*[slug.current == $slug][0]'
const slug = 'were-doing-it-live'
let syncTags: string[] = []

function render(lastLiveEventId?: string) {
  client.fetch(
    query,
    { slug },
    { filterResponse: false, lastLiveEventId }
  ).then((res) => {
    // Store the syncTags and "render" the data
    syncTags = res.syncTags
    const data = res.result
    console.log(data)
  })
}
// Initial render
render()
// Subscribe to live updates
const subscription = client.live.events().subscribe((event) => {
  if (event.type === 'message' && event.tags.some((tag) => syncTags.includes(tag))) {
    // If any of the event tags match our stored syncTags, refetch the data,
    // update our local cache and sync tags, and "re-render" it.
    render(event.id)
  }
  if (event.type === 'restart') {
    // A restart event is sent when the lastLiveEventId we've been given earlier is no longer usable
    render()
  }
})
// Later, unsubscribe when no longer needed
// subscription.unsubscribe()
In this example:
- We create a Sanity client instance with the necessary configuration.
- We define a query to fetch posts and execute it, setting filterResponse: false to get the syncTags along with the result.
- We store the returned syncTags and render the initial data.
- We subscribe to live updates using client.live.events().
- Whenever an update event is received, we check if any of its tags match our stored syncTags.
- If there's a match, we refetch the data, passing the event ID as lastLiveEventId to get the latest version.
- We update the stored syncTags and re-render with the fresh data.
- Finally, we unsubscribe from the live updates when no longer needed.
By following this pattern, your application can efficiently keep its content in sync with the latest changes in your Sanity dataset.
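The decision step in this pattern (should we refetch, and with which event ID?) is pure and easy to factor out. Here is a minimal sketch; refetchAction and the LiveEvent shape are our own illustration, not part of @sanity/client:

```typescript
// Sketch: given a live event and the sync tags from the last fetch,
// decide whether to refetch and which lastLiveEventId to pass.
type LiveEvent =
  | { type: 'message'; id: string; tags: string[] }
  | { type: 'restart' }

function refetchAction(
  event: LiveEvent,
  syncTags: string[]
): { refetch: boolean; lastLiveEventId?: string } {
  if (event.type === 'restart') {
    // Our previous position is no longer usable: refetch without an event ID
    return { refetch: true }
  }
  if (event.tags.some((tag) => syncTags.includes(tag))) {
    // A tag matched: refetch, using the event ID to bust the CDN cache
    return { refetch: true, lastLiveEventId: event.id }
  }
  return { refetch: false }
}
```

Keeping this logic pure makes it trivial to unit test without a network connection or a live dataset.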
Gotcha
Support for sync tags in GraphQL is coming soon.
On the HTTP level, the central piece of this feature is the Invalidation Channel. This is a live API adhering to the Server-Sent Events standard that streams sync tags as they become invalid.
GET /<version>/data/live/events/<dataset>
The API does not require authentication. Since tags are opaque, we don’t consider these to pose any risk of information leakage.
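Since no authentication is needed, you can watch the raw stream from the command line. A sketch, assuming the standard projectId.api.sanity.io host used by Sanity's other data APIs (your-project-id and your-dataset are placeholders for your own values):

```shell
# -N disables output buffering so events appear as they arrive
curl -N \
  -H "Accept: text/event-stream" \
  "https://your-project-id.api.sanity.io/vX/data/live/events/your-dataset"
```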
last-event-id: An optional start event ID ("position").
- If provided, it must be a previously returned event ID. The stream will continue with the next event following this position. A client can choose to pick any ID it has seen, but the intent is to allow a client to continue after disconnecting.
- If not given, the stream starts at the end, i.e. only new tags will be returned.
- If not usable (e.g. invalid encoding, or we have truncated the underlying stream so it no longer refers to a valid position, or it is not within a valid range), the first message emitted will be a restart message, and the stream will start at the end.
- We don't guarantee the "lifetime" of a position. A client should always be prepared to receive a restart message, and should assume a restart message invalidates earlier positions.
- 200 OK: If a streaming operation is started successfully.
- 406 Not Acceptable: If the Accept header does not include text/event-stream.
- All errors are represented as other standard status codes unless they occur after the stream has been started, in which case they are represented as error events.
- content-type will be text/event-stream.
- The response body is a Server-Sent Events stream.
- The API may choose to disconnect at any time. It is the client’s responsibility to reconnect and continue from the last position.
Welcome message
event: welcome
Restart
event: restart
Tags
In this message, the id field is the position.
id: MXxhYVlRejdGZUpPMA
data: {"tags": ["tag1","tag2"]}
Error
event: error
data: {"status": 500, "message": "Internal Server Error"}
Keepalive (an empty comment, as with the listener API)
:
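Tying the message formats above together, here is a sketch of parsing these frames into events. parseSseFrames is our own illustration: in a browser, EventSource does this for you, and a real client must also buffer frames that arrive split across network reads.

```typescript
// Sketch only: parse complete SSE frames from a text buffer.
type LiveEvent = { type: string; id?: string; data?: string }

function parseSseFrames(buffer: string): LiveEvent[] {
  const events: LiveEvent[] = []
  // Frames are separated by a blank line
  for (const frame of buffer.split('\n\n')) {
    const event: LiveEvent = { type: 'message' } // default type per the SSE spec
    let hasFields = false
    for (const line of frame.split('\n')) {
      // Skip blank lines and keepalive comments (lines starting with ':')
      if (line === '' || line.startsWith(':')) continue
      const sep = line.indexOf(':')
      const field = sep === -1 ? line : line.slice(0, sep)
      const value = sep === -1 ? '' : line.slice(sep + 1).replace(/^ /, '')
      if (field === 'event') { event.type = value; hasFields = true }
      else if (field === 'id') { event.id = value; hasFields = true }
      else if (field === 'data') {
        // Multiple data lines are joined with newlines
        event.data = event.data === undefined ? value : event.data + '\n' + value
        hasFields = true
      }
    }
    if (hasFields) events.push(event)
  }
  return events
}
```

The id field of a tags message is the position to resume from via the last-event-id header; a restart event tells you that position is gone and a full refetch is needed.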