For this example, we're starting with a blank DatoCMS project, then progressively adding models and records using migrations. We'll first create an Article model with a simple Title field.
With the CLI tool successfully set up, run the following command inside your project:
```
$ dato new migration 'create article model'
Created migrations/1591173668_createArticleModel.js
```
This will create a file named `<TIMESTAMP>_createArticleModel.js` inside a `migrations` directory. If the directory does not exist, it will be created as well.
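The numeric prefix (`1591173668` here) looks like the Unix timestamp of when the migration was generated, so ordering scripts by that prefix replays them in creation order. The idea can be sketched in a few lines of plain JavaScript (the `sortMigrations` helper is ours, not part of the CLI):

```javascript
// Sort migration filenames by their numeric timestamp prefix, so that
// older scripts run before newer ones. parseInt() stops at the first
// non-digit character, which conveniently extracts the prefix.
function sortMigrations(filenames) {
  return [...filenames].sort(
    (a, b) => parseInt(a, 10) - parseInt(b, 10),
  );
}

console.log(sortMigrations([
  '1591176768_addAuthors.js',
  '1591173668_createArticleModel.js',
]));
// → createArticleModel first, then addAuthors
```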
Let's take a look at the content of the migration script:
```javascript
module.exports = async (client) => {
  // DatoCMS migration script

  // For more examples, head to our Content Management API docs:
  // https://www.datocms.com/docs/content-management-api

  // Create an Article model:
  // https://www.datocms.com/docs/content-management-api/resources/item-type/create
  const articleModel = await client.itemTypes.create({
    name: 'Article',
    apiKey: 'article',
  });

  // Create a Title field (required):
  // https://www.datocms.com/docs/content-management-api/resources/field/create
  const titleField = await client.fields.create(articleModel.id, {
    label: 'Title',
    apiKey: 'title',
    fieldType: 'string',
    validators: {
      required: {},
    },
    appearance: {
      editor: 'single_line',
      parameters: {
        heading: true,
      },
      addons: [],
    },
  });

  // Create an Article record:
  // https://www.datocms.com/docs/content-management-api/resources/item/create
  const article = await client.items.create({
    itemType: articleModel.id,
    title: 'My first article!',
  });
};
```
The script exports an async function with a `client` argument, which is an already initialized Content Management API client. The body of the function is already filled with everything we need for this particular example.
If you would like to use your own template instead of the default one, you can pass it with the `--migrationTemplate` option:

```
$ dato new migration 'testing custom templates' --migrationTemplate=./relative-path-to/template.js
```
Now let's run the migrations, asking the CLI to fork the primary environment into a new sandbox environment:

```
$ dato migrate --destination=feature-add-article-model
```

Upon execution, the command creates a fork of the primary environment called `feature-add-article-model`, then runs every pending migration script inside it.
To track which migrations have already run in a specific environment, the CLI creates a special `schema_migration` model. After each migration script completes, it creates a record referencing the name of the script itself.
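The bookkeeping boils down to a set difference: given the scripts on disk and the names already recorded as `schema_migration` records, only the scripts without a matching record still need to run. A sketch of the idea (the `pendingMigrations` function is ours, for illustration only):

```javascript
// Given migration filenames on disk and the script names already
// recorded in the environment's `schema_migration` model, return
// only the scripts that still need to run.
function pendingMigrations(scriptsOnDisk, alreadyRunNames) {
  const done = new Set(alreadyRunNames);
  return scriptsOnDisk.filter((name) => !done.has(name));
}

// First run: nothing is recorded yet, so every script is pending.
console.log(pendingMigrations(
  ['1591173668_createArticleModel.js'],
  [],
));

// Later run: the script is recorded, so nothing is executed again.
console.log(pendingMigrations(
  ['1591173668_createArticleModel.js'],
  ['1591173668_createArticleModel.js'],
));
```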
To verify this behaviour, we can re-run the migrations and see the result. Since the sandbox environment `feature-add-article-model` already exists, the tool will fail:
```
$ dato migrate --destination=feature-add-article-model
✖ Creating a fork of `original` called `feature-add-article-model`...
Error: `feature-add-article-model` already exists! If you want to run
the migrations inside this existing environment you can add the
--inPlace flag.
```
We can ask the tool to run migrations in an existing sandbox environment by passing it as the `--source` argument and specifying the `--inPlace` flag:
```
$ dato migrate --source=feature-add-article-model --inPlace
Migrations will be run in sandbox env `feature-add-article-model`... done!
```
As you can see, no migration gets executed, as our script has already been run in this environment.
Let's create a new migration to add a new Author model, and an Author field on the article:
```
$ dato new migration addAuthors
Created migrations/1591176768_addAuthors.js
```
Let's replace the content of the new migration file with the following:
```javascript
module.exports = async (client) => {
  // Create the `author` model
  const authrModel = await client.itemTypes.create({ name: 'Author', apiKey: 'author' });

  // Add a `name` field to the `author` model
  await client.fields.create('author', {
    label: 'Name',
    apiKey: 'name',
    fieldType: 'string',
    validators: {
      required: {},
    },
  });

  // Add an `author` field to the `article` model
  await client.fields.create('article', {
    label: 'Author',
    apiKey: 'author',
    fieldType: 'link',
    validators: {
      itemItemType: { itemTypes: [authorModel.id] },
    },
  });

  // Create an `author` record
  const authorRecord = await client.items.create({
    itemType: authorModel.id,
    name: 'Mark Smith',
  });

  // Set the `author` field on every existing article
  const allArticles = await client.items.all(
    { filter: { type: 'article' } },
    { allPages: true },
  );

  for (const article of allArticles) {
    await client.items.update(article.id, {
      author: authorRecord.id,
    });
  }
};
```
Then run the new script in our existing `feature-add-article-model` sandbox environment:
```
$ dato migrate --source=feature-add-article-model --inPlace
Migrations will be run in sandbox env `feature-add-article-model`...
✖ Running 1591176768_addAuthors.js...
(node:2207) UnhandledPromiseRejectionWarning: ReferenceError: authorModel is not defined
    at module.exports (/Users/stefanoverna/dato/migrations-example/migrations/1591176768_addAuthors.js:21:35)
```
Ouch, it seems there was a typo in our migration script (`authrModel` instead of `authorModel`):
```javascript
module.exports = async (client) => {
  // Create the `author` model
  const authrModel = await client.itemTypes.create({ name: 'Author', apiKey: 'author' });
```
Since we're working on a sandbox, we can just fix the typo, delete the current sandbox from the DatoCMS UI, fork a new sandbox from the primary environment and re-run the migrations:
```
$ dato migrate --destination=feature-add-article-model
✔ Creating a fork of `original` called `feature-add-article-model`...
✔ Creating `schema_migration` model...
✔ Running 1591173668_createArticleModel.js...
✔ Running 1591176768_addAuthors.js...
Done!
```
It goes without saying that you'll also need to update your website/app to reflect the changes you just made. We suggest working on a feature branch and including the migration scripts in your commits.
All of our APIs and integrations offer a way to point to a sandbox environment instead of the primary one. As an example, if you're using our GraphQL Content Delivery API, you can explicitly read data from a specific environment using one of the following endpoints instead of the regular ones:
```
https://graphql.datocms.com/environments/{ENVIRONMENT-NAME}
https://graphql.datocms.com/environments/{ENVIRONMENT-NAME}/preview
```
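Picking the right endpoint is just string concatenation, so a tiny helper in your app can switch environments from a single place. A sketch (the `contentDeliveryEndpoint` name and shape are our own, not part of any DatoCMS SDK):

```javascript
// Build the GraphQL Content Delivery API endpoint for a given
// environment. Passing no environment targets the primary one;
// `preview` switches to the draft-content endpoint.
function contentDeliveryEndpoint(environment, { preview = false } = {}) {
  const base = 'https://graphql.datocms.com';
  const path = environment ? `/environments/${environment}` : '';
  return `${base}${path}${preview ? '/preview' : ''}`;
}

console.log(contentDeliveryEndpoint('feature-add-article-model'));
// https://graphql.datocms.com/environments/feature-add-article-model
```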
Once everything is working as expected, we can move on and ship everything to production.
After you've run your tests, you might need to programmatically delete a sandbox environment. In this case, you can simply run:
```
$ dato environment destroy sandbox-name
```