Sync data between servers

Enonic version: 6.11.1
OS: Linux


We have two servers, staging and production, and would like to synchronize data between them somehow. We have 18000+ objects on the servers, but 12000+ of them can be skipped, since they come from an external source via import. Skipping them shrinks the transfer a lot. All of these objects are located under the same folder.

We were reading the docs for the toolbox CLI, but it seems there is no way to exclude some objects and import the rest.

Our current idea was to create a bash script that will:

  1. go to the production server
  2. create an export (dump) via the toolbox CLI
  3. copy this export (dump) to our staging server
  4. import all the data on our staging server
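The four steps above could be sketched roughly like this. This is only an illustration: the hostnames, paths, credentials, and the toolbox install location are placeholders for our setup, and the `export`/`import` option syntax is my reading of the 6.x toolbox docs, so it should be verified against the actual install:

```shell
#!/usr/bin/env bash
# Hypothetical sync script: export on prod, copy over, import on staging.
# All hosts, paths, and credentials below are placeholders.
set -euo pipefail

PROD_HOST="prod.example.com"          # assumption: reachable over ssh
STAGING_HOST="staging.example.com"
EXPORT_NAME="prod-content-$(date +%Y%m%d)"

sync_content() {
  # 1-2. run the export on production via the toolbox CLI
  # (assumed 6.x syntax: -t is repo:branch:path, -d the export name)
  ssh "$PROD_HOST" "/opt/enonic/toolbox/toolbox.sh export \
      -a user:password -t cms-repo:draft:/content -d $EXPORT_NAME"

  # 3. copy the export from prod to staging via this machine
  # (exports are assumed to land under \$XP_HOME/data/export)
  scp -r "$PROD_HOST:/opt/enonic/home/data/export/$EXPORT_NAME" /tmp/
  scp -r "/tmp/$EXPORT_NAME" "$STAGING_HOST:/opt/enonic/home/data/export/"

  # 4. import on staging
  ssh "$STAGING_HOST" "/opt/enonic/toolbox/toolbox.sh import \
      -a user:password -t cms-repo:draft:/content -s $EXPORT_NAME"
}
```

With ssh keys set up between the machines, the whole thing could run unattended from cron.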

The main question is: are there any best practices or advice on how to implement this? Also, is it a good idea to run many exports via the toolbox CLI (exporting one node after another to skip lots of objects), or is there some specific function that already exists or will be available soon?

Thank you!

Hi! A dump backs up entire repositories. Export supports extracting substructures, as you can see from the documentation here:
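Since your 12000+ imported objects live under one folder, you can point the export at just the subtrees you actually need. A sketch, assuming the 6.x toolbox syntax and placeholder repo path and credentials (check `toolbox.sh export -h` on your install):

```shell
# Export a single subtree instead of the whole repository.
toolbox_export() {
  # -t: node to extract, as repo:branch:path
  # -d: export name; the result is written under $XP_HOME/data/export
  ./toolbox.sh export \
      -a user:password \
      -t cms-repo:draft:/content/mysite \
      -d mysite-export
}
```

Running one such export per subtree you care about is how you "skip" the imported folder: you simply never export it.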

If you wish to create scripts that automate parts of this process, I suggest you have a look at the Data Toolbox app: it supports exporting parts of the content tree and then re-importing them. Everything is open source.


Export only exports the current version of content, not the entire history.
So I would run export on what you need from the prod environment.
This will end up in a folder on the prod environment.
Then I would use rsync to copy only what's new onto the staging server.
And then run import on the staging server.
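The rsync step could look something like this. A sketch, assuming ssh access from staging to prod; the hostname and the `$XP_HOME/data/export` path are placeholders for your own setup:

```shell
# Pull the export from prod onto staging, transferring only changes.
copy_export_to_staging() {
  # -a preserves permissions and timestamps, -z compresses in transit,
  # and rsync's delta transfer skips files unchanged since the last run.
  # --delete removes files on staging that no longer exist in the export.
  rsync -az --delete \
      prod.example.com:/opt/enonic/home/data/export/mysite-export/ \
      /opt/enonic/home/data/export/mysite-export/
}
```

Because only new or changed files cross the wire, repeated syncs stay cheap even with thousands of objects in the export.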

Depending on your needs you might want to wipe all content from the staging server before importing.