
Running a patch commit to a set of documents at once in Sanity.io using a bulk deletion script.

19 replies
Last updated: Feb 8, 2022
is it possible to run a patch commit to a set of documents at once? We want to update the same type of information on many documents at once via the API.
Feb 7, 2022, 9:46 PM
You can write to a large number of documents in succession by queueing them, but you can't make a large number of simultaneous edits because of the API's rate limit.
Feb 7, 2022, 9:52 PM
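For reference, a minimal sketch of committing the same patch to a set of documents in a single transaction with the Sanity JS client. The query, the approvalSentDate field, and the studioClient import are placeholder assumptions, not something from this thread:

import { studioClient } from "./studioClient"

// Hypothetical query that returns only the _id of each document to patch
const query = `*[_type == 'applicant' && !defined(approvalSentDate)]._id`

const bulkPatch = async () => {
  const ids = await studioClient.fetch(query)
  // Add one patch per document to a single transaction,
  // so all of the changes are committed in one API request
  const transaction = ids.reduce(
    (tx, id) => tx.patch(id, { set: { approvalSentDate: 'foo' } }),
    studioClient.transaction()
  )
  const result = await transaction.commit()
  console.log(`Patched ${ids.length} documents`, result)
}

bulkPatch()

Very large document sets may still need to be split into smaller transactions to stay within request size limits.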
that amount should be fine. just didn’t know if we could still do it in bulk
Feb 7, 2022, 9:54 PM
user M
are there any docs on queuing?
Feb 7, 2022, 9:56 PM
Yep! Here's an example of a bulk deletion script I'll use:

import { studioClient } from "./studioClient"
import cq from 'concurrent-queue'

// Create a queue to limit the rate at which you write changes to Sanity
let queue = cq().limit({ concurrency: 2 }).process(function (task) {
  return new Promise(function (resolve, reject) {
      setTimeout(resolve.bind(undefined, task), 1000)
  })
})

//We write our query for the document(s) we want to delete
const query = `*[_type == 'category' && !(title in ['Countries', 'Weapons', 'Foods'])]`

const batchDelete = async () => {
  // Use the configured Studio client to fetch our documents
  const docs = await studioClient.fetch(query)
  // Loop through all of the docs returned from our query
  for (const doc of docs) {
    queue(doc).then(() => {
      // delete docs
      studioClient.delete(doc._id)
        .then(() => {
          console.log(`Deleted ${doc._id}`)
        })
        .catch((err) => {
          console.error('Delete failed: ', err.message)
        })
    })
  }
}

batchDelete()

// execute this script by running 
// $ sanity exec ./lib/utils/batchDelete.js --withUserToken
Feb 7, 2022, 9:56 PM
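If the goal is patching rather than deleting, as in the original question, the same queue pattern should work with the delete call swapped for a patch commit. A rough sketch of the loop body, where the field being set is only a placeholder:

for (const doc of docs) {
  queue(doc).then(() => {
    // Patch each doc once it comes up in the queue
    studioClient
      .patch(doc._id)
      .set({ approvalSentDate: 'foo' })
      .commit()
      .then((updated) => {
        console.log(`Patched ${updated._id}`)
      })
      .catch((err) => {
        console.error('Patch failed: ', err.message)
      })
  })
}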
There's nothing in Sanity's docs, since the client doesn't have its own rate-limiting method built in, but I use this package.
Feb 7, 2022, 9:57 PM
ahh okay
Feb 7, 2022, 9:57 PM
thank you!
Feb 7, 2022, 9:57 PM
You're welcome!
Feb 7, 2022, 9:57 PM
user M
random but i can’t seem to get to the async callback using your example. curious if you’ve had that issue?
Feb 8, 2022, 1:31 AM
I haven't! Are you running the exact script I shared?
Feb 8, 2022, 1:36 AM
i tried tweaking it a bit
Feb 8, 2022, 1:36 AM
const batchAddSendDate = async (docs) => {
    /*
    * Use the configured Studio client to fetch our documents
    * Loop through all of the docs returned from our query
    */
    
    for (const doc of docs) {
      queue(doc).then(async (dataInQueue) => {
        console.log('doc in loop: ', dataInQueue, doc);
        // client
        //   .patch(doc._id)
        //   .set({
        //     approvalSentDate: 'foo',
        //   })
        //   .commit()
        //   .then(updateDoc => {
        //     console.log('update doc: ', updateDoc)
        //   })
        //   .catch((err) => {
        //     console.log('Failure: ', err)
        //   })
      })
    }
  }

await client.getDocuments(getApplicantIds()).then((docs) => {
  batchAddSendDate(docs);
})
Feb 8, 2022, 1:37 AM
but it only runs through 2 docs, i’ve passed in like 3-5. Note getApplicantIds just returns an array of document _id values
Feb 8, 2022, 1:38 AM
Where is dataInQueue coming from? The function may not have access to it once it comes up in the queue.
Feb 8, 2022, 1:50 AM
so that was actually not returning anything
Feb 8, 2022, 1:52 AM
we’d get the 2 docs back from the queue but never actually made it to then
Feb 8, 2022, 1:52 AM
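One possible explanation, offered only as a guess: in the shared queue setup the process function resolves with the task itself (resolve.bind(undefined, task)), so dataInQueue should simply be the doc passed to queue(doc), and because the loop never awaits the promises queue() returns, the script can finish before the later tasks are processed. A sketch that collects those promises and waits for all of them, using the same hypothetical field name as above:

const batchAddSendDate = async (docs) => {
  // Keep the promise queue() returns for every doc,
  // then wait for all of them before the function resolves
  const tasks = docs.map((doc) =>
    queue(doc).then((dataInQueue) => {
      // dataInQueue is the doc the queue resolved with
      console.log('doc in queue: ', dataInQueue._id)
      return client
        .patch(dataInQueue._id)
        .set({ approvalSentDate: 'foo' })
        .commit()
    })
  )
  await Promise.all(tasks)
}

const docs = await client.getDocuments(getApplicantIds())
await batchAddSendDate(docs)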
