I have the following setup: a development computer on my local network and a production server with a real internet domain. The PHP/CSS files for my theme are developed on the dev machine and managed with git, so I can easily deploy them to the production system. Up to this point everything works fine.
But on my dev system I also always want a reasonably fresh database dump from the production system. The quick and dirty way would be to simply write a cronjob that runs mysqldump on the production server and mysql < dumpfile on the dev computer. But since WordPress stores the hostnames in its database, this would leave me with a non-working dev system, because all the links would be wrong.
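For illustration, the quick-and-dirty variant would look roughly like this (host name, credentials and paths are placeholders):

# on the production server (cronjob): dump the WordPress database
mysqldump -u wp_user -p'secret' wordpress > /backups/wordpress.sql

# on the dev computer: fetch and import the dump
scp user@example.com:/backups/wordpress.sql /tmp/wordpress.sql
mysql -u wp_user -p'secret' wordpress < /tmp/wordpress.sql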
I know of https://codex.wordpress.org/Moving_WordPress, but the steps there require a lot of manual fiddling (editing config files, etc.), so they are not well suited for automation. Sure, a lot of chmod, sed and mv calls could also be automated, but this feels rather “hacky”.
What would be a correct, clean way of handling such automated DB backups without breaking things?
1 Answer
You can use plugins such as BackupBuddy, but I prefer to script this and use wp-cli, which reads wp-config.php and means you don’t have to worry about mysql credentials.
wp-cli allows you to:
- export the db: wp db export <filename>
- import the db: wp db import <filename>
- do a safe search and replace (including serialised data): wp search-replace <search-string> <replace-string>
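For example, to rewrite the production URL to a local one (the hostnames here are placeholders; --skip-columns=guid leaves the post GUIDs alone, and --dry-run previews the changes without writing them):

wp search-replace 'https://example.com' 'http://dev.local' --skip-columns=guid --dry-run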
So, I have a script (sketched below) that:
- executes wp db export on the remote server via ssh
- copies the sql export file via scp to my local machine
- executes wp db import locally
- executes wp search-replace locally
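A minimal sketch of such a script, assuming wp-cli is installed on both machines; the host, paths and URLs are placeholders to adapt to your environment:

#!/bin/sh
set -e

REMOTE=user@example.com            # production server (ssh login)
REMOTE_WP=/var/www/html            # WordPress root on the server
LOCAL_WP=$HOME/sites/wordpress     # WordPress root on the dev machine
PROD_URL=https://example.com
DEV_URL=http://dev.local
DUMP=wp-sync.sql

# 1. export the db on the production server
ssh "$REMOTE" "cd $REMOTE_WP && wp db export /tmp/$DUMP"

# 2. copy the dump to the local machine
scp "$REMOTE:/tmp/$DUMP" "/tmp/$DUMP"

# 3. import it into the local WordPress install
cd "$LOCAL_WP"
wp db import "/tmp/$DUMP"

# 4. fix the hostnames, including serialised data
wp search-replace "$PROD_URL" "$DEV_URL" --skip-columns=guid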
You could then add it to cron, but I prefer to run it manually – it takes less than a minute to sync the db from production to my local environment.
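If you do want the fully automated route, a crontab entry along these lines would run it nightly (the script and log paths are hypothetical):

# m h dom mon dow  command
0 3 * * * $HOME/bin/wp-sync.sh >> $HOME/wp-sync.log 2>&1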