Unlock seamless workflows and faster delivery with our latest releases – get the details

Migration script for document types

By Knut Melvær

This migration scripts lets you migrate documents to a new type.

migrateDocumentType.ts

import {createClient} from '@sanity/client'

type Doc = {
  _id: string
  _rev?: string
  _type: string
  incomingReferences?: Doc[]
}

const token = process.env.SANITY_TOKEN
const projectId = process.env.SANITY_PROJECT_ID
const dataset = process.env.SANITY_DATASET
const apiVersion = '2023-03-01'

const client = createClient({
  apiVersion,
  projectId,
  dataset,
  token,
})

const OLD_TYPE = 'movie'
const NEW_TYPE = 'film'

const fetchDocuments = () =>
  client.fetch(
    `*[_type == $oldType][0...10] {..., "incomingReferences": *[references(^._id)]{...}}`,
    {oldType: OLD_TYPE}
  )

const buildMutations = (docs: Doc[]) => {
  const mutations: any = []

  docs.forEach((doc) => {
    console.log('movie', doc._id)
    // Updating an document _type field isn't allowed, we have to create a new and delete the old
    const newDocId = `${doc._id}-migrated`
    const newDocument = {...doc, ...{_id: newDocId, _type: NEW_TYPE}}
    delete newDocument.incomingReferences
    delete newDocument._rev

    mutations.push({create: newDocument})
    if (!doc.incomingReferences) {
      return
    }
    // Patch each of the incoming references
    doc.incomingReferences.forEach((referencingDocument) => {
      console.log('ref', referencingDocument._id)
      // ⚠️ We're assuming the field is named the same as the type!
      // There might be another structure involved, perhaps an array, that needs patching
      const updatedReference = {
        [NEW_TYPE]: {
          _ref: newDocId,
          _type: 'reference',
        },
      }
      mutations.push({
        id: referencingDocument._id,
        patch: {
          set: updatedReference,
          unset: [OLD_TYPE],
          ifRevisionID: referencingDocument._rev,
        },
      })
    })

    // Apply the delete mutation after references have been changed
    mutations.push({delete: doc._id})
  })
  return mutations.filter(Boolean)
}

const createTransaction = (mutations: any) => {
  return mutations.reduce((tx: any, mutation: any) => {
    if (mutation.patch) {
      return tx.patch(mutation.id, mutation.patch)
    }
    if (mutation.delete) {
      return tx.delete(mutation.delete)
    }
    if (mutation.create) {
      return tx.createIfNotExists(mutation.create)
    }
  }, client.transaction())
}

const migrateNextBatch: any = async () => {
  const documents = await fetchDocuments()
  if (documents.length === 0) {
    console.log('No more documents to migrate!')
    return null
  }
  const mutations = buildMutations(documents)
  const transaction = createTransaction(mutations)
  await transaction.commit()
  return migrateNextBatch()
}

migrateNextBatch().catch((err: any) => {
  console.error(JSON.stringify(err, null, 2))
  process.exit(1)
})

This example shows how you can perform a migration where a document _type field is changing. It will migrate documents in batches of 10 and continue patching until no more documents are returned from the query.

A few things to note:

  • Changing the _type field on a document isn't allowed. The solution to this is to create a duplicate with a new _id and _type, then delete the old document and patch all referencing documents with the new _id
  • This script will exit if any of the patches on the referencing documents fail due to a revision mismatch (which means the document was edited between fetch => update)
  • The query must eventually return an empty set, or else this script will continue indefinitely
  • There's no guard against losing data on the old, as it might change between fetch and the creation of the new document

Run this script with the command npx sanity exec --with-user-token migrateDocumentType.ts in a studio folder. This script deletes and changes data, so it might be wise to export your dataset first.

Contributor

Other schemas by author