Using scripts to migrate DatoCMS content schema
For this example, we'll start with a blank DatoCMS project and progressively add models/records using migrations.
There are two ways to create a migration script:
By writing it manually, or
By having the CLI automatically generate it for you.
We'll cover both methods in detail below.
Let's create an Article model with a simple Title field. With the CLI tool successfully set up, run the following command inside your project:

$ npx datocms migrations:new createArticleModel
This will create a script inside your migrations directory named <TIMESTAMP>_createArticleModel.js.
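The numeric prefix is a Unix timestamp, which keeps migration scripts sorted in the order they were created. As a rough sketch (not the CLI's actual implementation), the hypothetical migrationFilename helper below shows how such a name can be derived:

```javascript
// Illustrative only: build a timestamp-prefixed migration filename,
// e.g. "1653061497_createArticleModel.js".
function migrationFilename(name) {
  const timestamp = Math.floor(Date.now() / 1000); // Unix epoch, in seconds
  return `${timestamp}_${name}.js`;
}

console.log(migrationFilename('createArticleModel'));
```

Because the prefix is numeric and monotonically increasing, a plain lexicographic sort of the directory is enough to run scripts in creation order.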
Let's take a look at its content:
```js
'use strict';

/** @param client { import("@datocms/cli/lib/cma-client-node").Client } */
module.exports = async (client) => {
  // DatoCMS migration script

  // For more examples, head to our Content Management API docs:
  // https://www.datocms.com/docs/content-management-api

  // Create an Article model:
  // https://www.datocms.com/docs/content-management-api/resources/item-type/create
  const articleModel = await client.itemTypes.create({
    name: 'Article',
    api_key: 'article',
  });

  // Create a Title field (required):
  // https://www.datocms.com/docs/content-management-api/resources/field/create
  const titleField = await client.fields.create(articleModel, {
    label: 'Title',
    api_key: 'title',
    field_type: 'string',
    validators: {
      required: {},
    },
  });

  // Create an Article record:
  // https://www.datocms.com/docs/content-management-api/resources/item/create
  const article = await client.items.create({
    item_type: articleModel,
    title: 'My first article!',
  });
};
```
The script exports an async function with a client argument, which is an instance of our Content Management API client.
The body of the function is already filled in with everything we need for this particular example, but you can of course rewrite the migration script to your liking, using any method available in our Content Management API to produce the desired result.
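Since the migration is nothing more than an exported async function, you can also dry-run it locally against any object that implements the few client methods it calls. In the sketch below, makeStubClient is a hypothetical stand-in for the real Content Management API client, useful only to check which calls the migration would make:

```javascript
// Hypothetical stub client that records the CMA calls a migration makes,
// implementing only the methods our example script uses.
function makeStubClient() {
  const calls = [];
  return {
    calls,
    itemTypes: {
      create: async (attrs) => {
        calls.push('itemTypes.create');
        return { id: 'model-1', ...attrs };
      },
    },
    fields: {
      create: async (_model, attrs) => {
        calls.push('fields.create');
        return { id: 'field-1', ...attrs };
      },
    },
    items: {
      create: async (attrs) => {
        calls.push('items.create');
        return { id: 'item-1', ...attrs };
      },
    },
  };
}

// Same shape as the generated migration script:
const migration = async (client) => {
  const articleModel = await client.itemTypes.create({ name: 'Article', api_key: 'article' });
  await client.fields.create(articleModel, { label: 'Title', api_key: 'title', field_type: 'string' });
  await client.items.create({ item_type: articleModel, title: 'My first article!' });
};

migration(makeStubClient()).then(() => console.log('migration dry-run ok'));
```

This is only a local sanity check; the real client performs actual API calls against your project.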
If you would like to scaffold new migration scripts from a custom template instead of the default one, feel free to pass the --template flag. Or, even better, you can add it as a default setting to your profile with the datocms profile:set command, so that the choice will propagate to every other team member.
To execute the migration, run the following command:
$ npx datocms migrations:run --destination=feature-branch --api-token=<YOUR-API-TOKEN>
Upon execution, the command does the following:
Forks the primary environment into a new sandbox environment called feature-branch;
Runs any pending migrations inside the sandbox environment.
To track which migrations have already been run in a specific environment, the CLI creates a special schema_migration model in your project. After each migration script completes, it creates a record referencing the name of the script itself.
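Conceptually, this bookkeeping boils down to a set difference: the scripts on disk minus the script names already recorded in the tracking model. The pendingMigrations helper below is a hypothetical sketch of that logic, not the CLI's actual code:

```javascript
// Illustrative: given the script files on disk and the names already
// recorded in the migrations model, only the difference is pending.
function pendingMigrations(scriptsOnDisk, alreadyRun) {
  const done = new Set(alreadyRun);
  return scriptsOnDisk
    .filter((name) => !done.has(name))
    .sort(); // timestamp prefixes keep chronological order
}

console.log(pendingMigrations(
  ['1653061497_createArticleModel.js', '1653062813_addAuthors.js'],
  ['1653061497_createArticleModel.js'],
));
// → ['1653062813_addAuthors.js']
```

This is why re-running the command in the same environment is safe: anything already recorded is simply skipped.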
You can configure the name of the model with the --migrations-model flag, or configure your profile accordingly with the datocms profile:set command!
To verify that only pending migrations are executed, we can re-run the same command and see the result:
$ npx datocms migrations:run --destination=feature-branch
Migrations will be run in "feature-branch" sandbox environment

Creating a fork of "main" environment called "feature-branch"... !
 ›   Error: Environment "feature-branch" already exists!
 ›   Try this:
 ›   * To execute the migrations inside the existing environment, run "datocms migrations:run --source=feature-branch --in-place"
 ›   * To delete the environment, run "datocms environments:destroy feature-branch"
Ouch! The sandbox environment feature-branch already exists, so the command failed. We can follow the CLI suggestion and re-run the migrations inside the already existing sandbox environment:
$ npx datocms migrations:run --source=feature-branch --in-place
Migrations will be run in "feature-branch" sandbox environment
No new migration scripts to run, skipping operation
As you can see, no migration gets executed, as our script has already been run in this environment!
Remember that you can always add the --json flag to any CLI command to get a JSON output, easily parsable by tools like jq.
Let's create a new migration script to add a new Author model, and an Author field on the article. This time, we're going to use the --autogenerate flag on the migrations:new command.
The --autogenerate flag takes one or two environment names as its argument:
```
$ npx datocms migrations:new --help

[...]

--autogenerate=<value>
    Auto-generates script by diffing the schema of two environments

    Examples:
    * --autogenerate=foo finds changes made to sandbox environment 'foo' and
      applies them to primary environment
    * --autogenerate=foo:bar finds changes made to environment 'foo' and applies
      them to environment 'bar'
```
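In other words, the value is either a single environment name or a source:destination pair. The parseAutogenerate helper below is a hypothetical sketch of how the two forms can be interpreted (assuming the primary environment is called main); it is not the CLI's actual parser:

```javascript
// Illustrative: interpret the two accepted forms of the flag value,
// 'foo' (apply to primary) and 'foo:bar' (apply to 'bar').
function parseAutogenerate(value, primaryEnv = 'main') {
  const [source, destination] = value.split(':');
  return { source, destination: destination || primaryEnv };
}

console.log(parseAutogenerate('foo'));     // { source: 'foo', destination: 'main' }
console.log(parseAutogenerate('foo:bar')); // { source: 'foo', destination: 'bar' }
```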
So first we need to make a copy of our feature-branch environment (let's call it with-authors):
$ npx datocms environments:fork feature-branch with-authors
Creating a fork of "feature-branch" called "with-authors"... done
Then we add our Author model and Author field to the with-authors environment using the regular DatoCMS interface, and run the following command to generate a migration script:
$ npx datocms migrations:new addAuthors --autogenerate=with-authors:feature-branch --api-token=<YOUR-API-TOKEN>
Writing "migrations/1653062813_addAuthors.js"... done
Let's see the result:
```js
/** @param client { import("@datocms/cli/lib/cma-client-node").Client } */
module.exports = async function (client) {
  const newFields = {};
  const newItemTypes = {};
  const newMenuItems = {};

  console.log('Create new models/block models');

  console.log('Create model "Author" (`author`)');
  newItemTypes['531556'] = await client.itemTypes.create(
    {
      name: 'Author',
      api_key: 'author',
      all_locales_required: true,
      collection_appearance: 'table',
    },
    { skip_menu_item_creation: 'true' },
  );

  console.log('Creating new fields/fieldsets');

  console.log('Create Single-line string field "Name" (`name`) in model "Author" (`author`)');
  newFields['2791804'] = await client.fields.create(newItemTypes['531556'], {
    label: 'Name',
    field_type: 'string',
    api_key: 'name',
    validators: { required: {} },
    appearance: {
      addons: [],
      editor: 'single_line',
      parameters: { heading: true },
      type: 'title',
    },
    default_value: '',
  });

  console.log('Create Single link field "Author" (`author`) in model "Blog Post" (`blog_post`)');
  newFields['2791806'] = await client.fields.create('810907', {
    label: 'Author',
    field_type: 'link',
    api_key: 'author',
    validators: {
      item_item_type: {
        on_publish_with_unpublished_references_strategy: 'fail',
        on_reference_unpublish_strategy: 'delete_references',
        on_reference_delete_strategy: 'delete_references',
        item_types: [newItemTypes['531556'].id],
      },
      required: {},
    },
    appearance: { addons: [], editor: 'link_select', parameters: {} },
  });

  console.log('Finalize models/block models');

  console.log('Update model "Author" (`author`)');
  await client.itemTypes.update(newItemTypes['531556'], {
    title_field: newFields['2791804'],
  });

  console.log('Manage menu items');

  console.log('Create menu item "Authors"');
  newMenuItems['265140'] = await client.menuItems.create({
    label: 'Authors',
    item_type: newItemTypes['531556'],
  });
};
```
Wow! Thanks, CLI, that's a lot of code for free!
If we run the migrations again in our feature-branch environment, we can verify that the script is indeed working:
```
$ npx datocms migrations:run --source=feature-branch --in-place --api-token=<YOUR-API-TOKEN>

Migrations will be run in "feature-branch" sandbox environment

Running migration "1653062813_addAuthors.js"...
Create new models/block models
Create model "Author" (`author`)
Creating new fields/fieldsets
Create Single-line string field "Name" (`name`) in model "Author" (`author`)
Finalize models/block models
Update model "Author" (`author`)
Manage menu items
Create menu item "Authors"
done
```
The --autogenerate flag will not take into account changes made to records and uploads! If you need those, you'll have to write your own migration script manually, or extend the one that the autogeneration tool created for you.
Suppose you need to make a change to a migration script after running it. Since we're working in a sandbox, to test the new script we can simply delete the current sandbox, fork a new one from the primary environment, and re-run the migrations:
```
$ npx datocms environments:destroy feature-branch

Destroying environment "feature-branch"... done

$ npx datocms migrations:run --destination=feature-branch

Migrations will be run in "feature-branch" sandbox environment

Creating a fork of "main" environment called "feature-branch"... done
Creating "schema_migration" model... done
Running migration "1653061497_createArticleModel.js"... done
Running migration "1653062813_addAuthors.js"... done
Successfully run 2 migration scripts

Done!
```
It goes without saying that you also need to work on your website/app to adapt it to the changes you just made. We suggest working on a Git feature branch, and storing the migrations directory in the same repo as your frontend code.
All of our APIs and integrations offer a way to point to a sandbox environment instead of the primary one.
For example, if you're using our GraphQL Content Delivery API, you can explicitly read data from a specific environment using one of the following endpoints:
https://graphql.datocms.com/environments/{ENVIRONMENT-NAME}
https://graphql.datocms.com/environments/{ENVIRONMENT-NAME}/preview
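A small helper can build the right endpoint from an environment name; graphqlEndpoint below is a hypothetical convenience for your own code, not part of any DatoCMS SDK:

```javascript
// Builds the Content Delivery API endpoint for a given sandbox
// environment, optionally targeting draft content via /preview.
function graphqlEndpoint(environment, { preview = false } = {}) {
  const base = `https://graphql.datocms.com/environments/${environment}`;
  return preview ? `${base}/preview` : base;
}

console.log(graphqlEndpoint('feature-branch'));
// → https://graphql.datocms.com/environments/feature-branch
console.log(graphqlEndpoint('feature-branch', { preview: true }));
// → https://graphql.datocms.com/environments/feature-branch/preview
```

In your frontend, you could switch the environment via an env var on the feature branch's deploy, so the branch reads from the sandbox while production keeps reading from the primary environment.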
Once everything is working as expected, we can ship everything to production.
After you've run your tests, you might need to programmatically delete a sandbox environment. In that case, you can simply run:
$ npx datocms environments:destroy <SANDBOX-ENVIRONMENT-NAME>
When working with large environments, a fork can become slow. To address this, DatoCMS offers a "fast fork" option that can be up to 20 times faster than a regular fork. Note, however, that during the fork the source environment is kept in read-only mode, so other users won't be able to make any changes to its content. This is similar to turning on maintenance mode.
To use the fast fork option, you can select it either from the DatoCMS interface or the CLI.
To run a fast fork from the DatoCMS interface, simply select the "Perform a fast fork?" option when creating a fork:
On the CLI, both the migrations:run and the environments:fork commands support an additional flag for the fast fork option.
To use the fast fork option with the migrations:run command, run the following:
$ npx datocms migrations:run --destination=new-sandbox-env --fast-fork
Similarly, to use the fast fork option with the environments:fork command, run the following:
$ npx datocms environments:fork source-env destination-env --fast
The --force option forces the start of a fast fork process in any case, even if a user is currently making changes to a record.
To use the --force option, add it to the end of the command like this:
$ npx datocms environments:fork source-env destination-env --fast --force
$ npx datocms migrations:run --destination=new-sandbox-env --fast-fork --force
By adding the --force option, you're telling the CLI to proceed with the fast fork even if there are users making changes to records.
Use the --force option with caution, as it can destroy in-progress work by other users. We recommend communicating and coordinating with your team before using it, to avoid any issues.