Batch patch commits to multiple documents via API

19 replies
Last updated: Feb 8, 2022
Is it possible to run a patch commit on a set of documents at once? We want to update the same type of information across many documents at once via the API.
AI Update

Yes, you can definitely apply patch operations to multiple documents at once! Sanity provides several approaches depending on your specific needs:

Using Transactions

The most straightforward way to patch multiple documents atomically is using transactions. Transactions allow you to batch multiple patch operations together with all-or-nothing execution:

await client
  .transaction()
  .patch('doc-id-1', p => p.set({ status: 'published' }))
  .patch('doc-id-2', p => p.set({ status: 'published' }))
  .patch('doc-id-3', p => p.set({ status: 'published' }))
  .commit()

This ensures either all patches succeed or none are applied, maintaining data consistency.

Bulk Updates with Batching

For updating many documents (100+), you'll want to batch your operations and respect rate limits. Here's a pattern that works well:

import pThrottle from 'p-throttle';

const throttle = pThrottle({
  limit: 25,      // at most 25 calls...
  interval: 1000  // ...per 1000 ms
});

// Wrap the commit once so every call shares the same rate limiter
const throttledCommit = throttle((transaction) => transaction.commit());

// Query for the documents you want to update
const documents = await client.fetch('*[_type == "post" && status == "draft"]');

// Process in batches
const batchSize = 100;
for (let i = 0; i < documents.length; i += batchSize) {
  const batch = documents.slice(i, i + batchSize);

  const transaction = client.transaction();
  batch.forEach(doc => {
    transaction.patch(doc._id, p => p.set({ status: 'published' }));
  });

  await throttledCommit(transaction);
}

Important Considerations

  • Transaction size: Keep transaction payloads under 500kB
  • Rate limits: Maximum 25 requests per second
  • Field initialization: When using operations like inc(), use setIfMissing() first to ensure the field exists
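To illustrate the last point, here is a self-contained sketch of why setIfMissing should run before inc: an increment has no base value to work from if the field doesn't exist yet. The two helper functions below only model the semantics of those patch operations on a plain object; they are not the Sanity client API.

```javascript
// Model of the setIfMissing + inc pattern on a plain object
// (illustrative only; the real operations run server-side).
function setIfMissing(doc, patch) {
  for (const [key, value] of Object.entries(patch)) {
    if (!(key in doc)) doc[key] = value;
  }
  return doc;
}

function inc(doc, patch) {
  for (const [key, amount] of Object.entries(patch)) {
    doc[key] = doc[key] + amount; // yields NaN if the field was never set
  }
  return doc;
}

const doc = { _id: 'post-1', title: 'Hello' };
setIfMissing(doc, { likes: 0 }); // initialize the field first
inc(doc, { likes: 1 });          // now the increment has a base value
console.log(doc.likes);          // 1
```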

The patches documentation provides more details on available patch operations like set, unset, insert, and array manipulation.
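For context on what those operations look like on the wire, here is a sketch of the raw mutation payload a patch ultimately turns into (the shape shown is an illustration of the Mutations API format; check the mutations docs for the full set of options):

```javascript
// Sketch of a single patch mutation combining set, unset, and insert.
// The insert appends 'news' after the last element of the tags array.
const body = {
  mutations: [
    {
      patch: {
        id: 'doc-id-1',
        set: { status: 'published' },
        unset: ['legacyField'],
        insert: { after: 'tags[-1]', items: ['news'] },
      },
    },
  ],
};

console.log(JSON.stringify(body.mutations[0].patch.unset)); // ["legacyField"]
```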

You can write to a large number of documents in succession by queueing them, but you can't make a large number of simultaneous edits because of the API's rate limit.
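As a minimal sketch of what queueing means here: run the writes one at a time with a small delay between each, instead of firing them all simultaneously. writeDoc below is a hypothetical stand-in for a client call, not a real API.

```javascript
const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

// Run writes sequentially with a delay between each, instead of in parallel.
async function processQueue(docs, writeDoc, delayMs = 50) {
  const results = [];
  for (const doc of docs) {
    results.push(await writeDoc(doc)); // one write at a time
    await sleep(delayMs);              // space writes out under the rate limit
  }
  return results;
}
```

The total runtime grows linearly with the number of documents, which is the trade-off for never exceeding the rate limit.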
That amount should be fine. Just didn't know if we could still do it in bulk.
user M
Are there any docs on queueing?
Yep! Here's an example of a bulk deletion script I'll use:

import { studioClient } from "./studioClient"
import cq from 'concurrent-queue'

// Create a queue to limit the rate at which you write changes to Sanity
let queue = cq().limit({ concurrency: 2 }).process(function (task) {
  return new Promise(function (resolve, reject) {
      setTimeout(resolve.bind(undefined, task), 1000)
  })
})

// Write the query for the document(s) we want to delete
const query = `*[_type == 'category' && !(title in ['Countries', 'Weapons', 'Foods'])]`

const batchDelete = async () => {
  // Use the configured Studio client to fetch our documents
  const docs = await studioClient.fetch(query)
  // Loop through all of the docs returned from our query
  for (const doc of docs) {
    queue(doc).then(async () => {
      // Delete the doc once its turn in the queue comes up
      studioClient.delete(doc._id)
        .then(() => {
          console.log(`Deleted ${doc._id}`)
        })
        .catch((err) => {
          console.error('Delete failed: ', err.message)
        })
    })
  }
}

batchDelete()

// execute this script by running 
// $ sanity exec ./lib/utils/batchDelete.js --withUserToken
There's nothing in Sanity's docs, since the client doesn't have its own rate-limiting method built in, but I use this package.
ahh okay
thank you!
You're welcome!
user M
random but i can’t seem to get to the async callback using your example. curious if you’ve had that issue?
I haven't! Are you running the exact script I shared?
I tried tweaking it a bit:
const batchAddSendDate = async (docs) => {
  /*
   * Use the configured Studio client to fetch our documents
   * Loop through all of the docs returned from our query
   */
  for (const doc of docs) {
    queue(doc).then(async (dataInQueue) => {
      console.log('doc in loop: ', dataInQueue, doc);
      // client
      //   .patch(doc._id)
      //   .set({
      //     approvalSentDate: 'foo',
      //   })
      //   .commit()
      //   .then(updateDoc => {
      //     console.log('update doc: ', updateDoc)
      //   })
      //   .catch((err) => {
      //     console.log('Failure: ', err)
      //   })
    })
  }
}

await client.getDocuments(getApplicantIds()).then((docs) => {
  batchAddSendDate(docs);
})
but it only runs through 2 docs, and I've passed in like 3-5. Note: getApplicantIds just returns an array of document _id values.
Where is dataInQueue coming from? The function may not have access to it once it comes up in the queue.
So that was actually not returning anything. We'd get the 2 docs back from the queue but never actually made it to then.
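One likely culprit (an assumption, since the full script isn't shown): the loop schedules work on the queue but never awaits the resulting promises, so a script runner can exit after the first couple of tasks settle. Collecting every promise and awaiting them all keeps the process alive until the work is done. mockQueue below is a hypothetical stand-in for the concurrent-queue instance from the thread.

```javascript
const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

// Stand-in for the concurrent-queue instance from the thread.
const mockQueue = (task) => sleep(10).then(() => task);

// Schedule every doc on the queue, keep the promises, and await them all
// so the script can't exit while patches are still in flight.
async function patchAll(docs, patchDoc) {
  const pending = docs.map((doc) => mockQueue(doc).then((d) => patchDoc(d)));
  return Promise.all(pending);
}
```

With this shape, the caller does `await patchAll(docs, ...)` and the process only ends once every queued patch has resolved or rejected.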
