
    Exporting data and backups

    Higher tiers of DatoCMS offer the ability to generate nightly copies of your content to your own Amazon S3 buckets, but even on lower plans, making offline backups is extremely easy.

    If you are just looking for an easy way to export content directly through the dashboard, you can take a look at some plugins that do exactly that, such as Project Exporter or Spreadsheet Record Exporter.

    For a more programmatic approach, here's a quick example script that uses our Content Management API to dump every record into a records.json file:

    import { buildClient } from '@datocms/cma-client-node';
    import fs from 'fs/promises';

    async function main() {
      // A full-access API token is required to read every record
      const client = buildClient({
        apiToken: 'YOUR-FULL-ACCESS-API-KEY',
        environment: 'YOUR-ENVIRONMENT-NAME',
      });

      // Fetch all models, excluding modular blocks (their content is embedded in regular records)
      const itemTypes = await client.itemTypes.list();
      const models = itemTypes.filter((itemType) => !itemType.modular_block);
      const modelIds = models.map((model) => model.id);

      // Iterate over every record of every model, page by page
      const records = [];

      for await (const record of client.items.listPagedIterator({
        nested: true,
        filter: { type: modelIds.join(',') },
      })) {
        records.push(record);
      }

      // Serialize everything into a single JSON file
      const jsonContent = JSON.stringify(records, null, 2);
      await fs.writeFile('records.json', jsonContent, 'utf8');
    }

    main();

    Here is a simple script that lists all your assets and downloads them locally:

    import { buildClient } from '@datocms/cma-client-node';
    import fetch from 'node-fetch';
    import { writeFile } from 'fs/promises';

    // Download a single file into the current directory,
    // using the last segment of the URL path as its file name
    async function downloadImage(url) {
      const response = await fetch(url);
      const buffer = Buffer.from(await response.arrayBuffer());
      const fileName = new URL(url).pathname.split('/').pop();
      await writeFile('./' + fileName, buffer);
    }

    async function main() {
      const client = buildClient({
        apiToken: 'YOUR-FULL-ACCESS-API-KEY',
        environment: 'YOUR-ENVIRONMENT-NAME',
      });

      // The project settings expose the imgix host that serves the uploads
      const site = await client.site.find();

      // Iterate over every upload, page by page, and download it locally
      for await (const upload of client.uploads.listPagedIterator()) {
        const imageUrl = 'https://' + site.imgix_host + upload.path;
        console.log(`Downloading ${imageUrl}...`);
        await downloadImage(imageUrl);
      }
    }

    main();

    You can then run these scripts from a cron job and upload the result to an S3 bucket, as shown in the sketch below.
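
    As a rough sketch, assuming the export script above is saved as export-records.js and the official @aws-sdk/client-s3 package is installed (the bucket name, region, object key and file paths below are placeholders, not values DatoCMS provides), the upload step could look like this:

    import { S3Client, PutObjectCommand } from '@aws-sdk/client-s3';
    import { readFile } from 'fs/promises';

    // Placeholder region and bucket: replace with your own values
    const s3 = new S3Client({ region: 'us-east-1' });

    async function main() {
      const body = await readFile('records.json');

      // Store each backup under a date-based key so older copies are kept
      await s3.send(
        new PutObjectCommand({
          Bucket: 'your-backup-bucket',
          Key: `datocms-backups/records-${new Date().toISOString().slice(0, 10)}.json`,
          Body: body,
          ContentType: 'application/json',
        }),
      );
    }

    main();

    A crontab entry along the lines of 0 3 * * * node /path/to/export-records.js && node /path/to/upload-backup.js would then produce and upload a fresh copy every night.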