by Emmanuel Frécon

Influx is a competent time-series database that is often used in IoT projects. As with any database, you will want to perform regular backups to avoid losing data. IoT projects tend to collect a lot of data, so planning these backups is usually a good idea.

Influx comes with a command-line tool called influxd for backup and restore. While this provides the raw functionality to extract data into snapshots that can later be used to restore data, you are likely to want to keep a number of backups, rotate them, and move them offsite. A good strategy for these backups is to create containers based on the following images using compose files:
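
To illustrate the raw functionality, the commands look roughly as follows; the host name, database name and target directory are placeholders:

```shell
# Back up a single database from a (remote) Influx server in the
# portable format introduced in InfluxDB 1.5. Host, database and
# destination directory are placeholders.
influxd backup -portable -database mydb -host influx.example.com:8088 /backups/mydb

# Later, restore that snapshot into a fresh database so the original
# one is not overwritten.
influxd restore -portable -db mydb -newdb mydb_restored /backups/mydb
```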

  • efrecon/influx-backup will periodically connect to Influx and run the necessary influxd commands to back up some or all of the databases present on the server. The image can also prune old backups so that only a restricted set (typically the latest ones) is kept.
  • instrumentisto/rsync-ssh is a wrapper around rsync that can be used to copy these backups offsite onto different storage solutions. There are many alternatives, including fentas/davfs, a Docker volume plugin for WebDAV.
  • efrecon/dockron acts like a cron server for your containers: it can restart a container such as the rsync one above at regular intervals, ensuring that the backups generated by the first container are copied regularly.
  • Using different volumes for the local and remote backups is a good way to separate data and facilitates debugging if that becomes necessary.
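
A compose file tying these pieces together could look roughly like the sketch below. The options for efrecon/influx-backup and the scheduling rules for dockron are deliberately left as placeholders rather than real settings; check each image's README for its actual configuration before use:

```yaml
version: "3"

services:
  backup:
    image: efrecon/influx-backup
    # Configuration (target host, period, retention) goes here; see
    # the image's README for the real options.
    volumes:
      - local-backups:/backup

  offsite:
    image: instrumentisto/rsync-ssh
    # Copy the local backups to a remote host over SSH; the
    # destination below is a placeholder.
    command: rsync -avz --delete /backup/ user@remote.example.com:/backups/
    volumes:
      - local-backups:/backup:ro

  cron:
    image: efrecon/dockron
    # dockron talks to the Docker API to (re)start containers on a
    # schedule, hence the socket mount. See its README for rule syntax.
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock

volumes:
  local-backups:
```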

In order for efrecon/influx-backup to work properly, including collecting the list of databases and time-series present on the server, it is necessary to let Influx accept connections on port 8088. This is best achieved by declaring the environment variable INFLUXDB_BIND_ADDRESS when creating the influx container and setting it to an address such as 0.0.0.0:8088, rather than the localhost-only default. When using Docker networks, you can arrange for your backup containers to access port 8088 without exposing the port on the host directly.
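
In a compose file, this could look like the fragment below, assuming the official influxdb 1.x image, which maps INFLUXDB_* environment variables onto its configuration file:

```yaml
services:
  influx:
    image: influxdb:1.8
    environment:
      # Make the RPC port used by influxd backup listen on all
      # interfaces inside the container, not just on localhost.
      - INFLUXDB_BIND_ADDRESS=0.0.0.0:8088
    networks:
      - backup
    # No "ports:" section: 8088 stays reachable only from containers
    # on the same network, never from the host.

networks:
  backup:
```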


There are many ways to synchronise against the major remote storage providers. One promising tool is rclone, which supports a large portfolio of providers and has some initial Docker support. To add to the mix, when GDPR requirements are stringent, projects such as minio can provide S3-compatible storage on premises.
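
For instance, once a remote has been set up interactively with rclone config, mirroring the backup directory is a one-liner; the remote name and bucket below are placeholders:

```shell
# Mirror the local backup directory onto a previously configured
# rclone remote ("s3remote" and the bucket name are placeholders).
rclone sync /backup s3remote:influx-backups
```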