
I'm dumping a MySQL WordPress database every day as a backup. Since I don't want to end up with 365 .sql files after a year, I figured it would be sensible to keep only the last 30 days of dump files: always keep the latest 30 and automatically delete the older ones, one per day.

I am looking to program this in Bash as part of a cron job. I already have the part where I dump the database and send the file to the backup server; I only need to add the snippet that counts the files and deletes the oldest one each day.

Here is what I've got (the username and password are kept in a .my.cnf file):

now=$(date +'%m-%d-%y')
mysqldump -h mysql.server.com my_database | gzip -9 > ${home}/dbBackups/db_backup.sql.gz
mv ${home}/dbBackups/db_backup.sql.gz ${home}/dbBackups/${now}_db_backup.sql.gz
scp ${home}/dbBackups/${now}_db_backup.sql.gz [email protected]:/home/backup_user/backup.server.com/dbBackups/

Does anyone have an idea on how to implement this functionality?

  • Does your cron task run on your machine or the remote one? You should have it run on the remote one; otherwise it would be too easy. If you are not the admin of the remote one, a second-best solution would be keeping exact backups locally, then using rsync with the --delete option to delete the remote backups. Commented Jul 1, 2015 at 3:06
  • You could also create a logrotate definition for the file/folder. See: man 8 logrotate. Commented Jul 1, 2015 at 4:15
  • Yeah, both of you offer very valuable information. @HuStmpHrrr, I'll definitely try this option first, as it looks like the simplest one and requires less configuration than logrotate in my case. Commented Jul 1, 2015 at 11:52
  • @DavidC.Rankin, thanks for the tip; I've never used logrotate before and will definitely dig into it for this or another project. Commented Jul 1, 2015 at 11:53
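For reference, the logrotate route mentioned in the comments could replace the manual date-stamping entirely: point logrotate at the single dump file and let it keep 30 rotated copies. This is only an illustrative sketch, not a tested configuration; the path matches the question's layout, and the options should be checked against man 8 logrotate:

```
# /etc/logrotate.d/db-backups (illustrative sketch)
/home/myuser/dbBackups/db_backup.sql.gz {
    daily
    rotate 30
    missingok
    notifempty
    # the dump is already gzipped, so don't compress it again
    nocompress
}
```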

3 Answers


The standard command to print files older than 30 days is

find <full_path_to_your_dir> -type f -mtime +30 -print 

The standard command to delete files older than 30 days is

find <full_path_to_your_dir> -type f -mtime +30 -delete 

The above command will delete all files older than 30 days.
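As a precaution, the match can be narrowed to the dump files themselves so that nothing else in the directory is ever removed. This is only a sketch built on the question's *_db_backup.sql.gz naming scheme; the backup_dir variable is a stand-in for your actual path:

```shell
#!/bin/sh
# Narrow the match to the dated dump files so an unrelated file in the
# same directory can never be deleted. backup_dir is a placeholder path.
backup_dir="${BACKUP_DIR:-$HOME/dbBackups}"
mkdir -p "$backup_dir"

# Dry run first: print what WOULD be deleted
find "$backup_dir" -type f -name '*_db_backup.sql.gz' -mtime +30 -print

# Once the printed list looks right, swap -print for -delete
find "$backup_dir" -type f -name '*_db_backup.sql.gz' -mtime +30 -delete
```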


3 Comments

This will be part 1 of my 2 parts solution. As is, it looks like it will do the job. Thanks!
The question that comes up with this line is: since it will run in a cron job without prompting, what happens during the first 30 days, when the find command won't find anything to delete? Will it just skip to the next command? Will it freeze the script?
If it doesn't find any files older than 30 days, it won't do anything and will continue to the next command. You don't have to worry about it hanging the script while waiting for find to do something.

The find command mentioned above is the easiest/cleanest solution. If you want, you can also do

old=$(date -d "30 days ago" +'%m-%d-%y')
rm ${home}/dbBackups/${old}_db_backup.sql.gz

You will want to make sure that there is no way to screw up your paths. In fact, ${home} is dangerously close to the environment variable $HOME, so you may consider renaming it. You could also cron a simple script like that to run daily to remove files from wherever you are scp'ing them.
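Following that suggestion, a daily cleanup entry in the crontab of the receiving backup host might look like the fragment below. The path is the scp target from the question, and the 02:30 schedule is an arbitrary example:

```
# Example crontab entry for backup_user on the remote host (crontab -e).
# Runs every day at 02:30 and deletes dumps older than 30 days.
# m  h  dom mon dow  command
30   2  *   *   *    find /home/backup_user/backup.server.com/dbBackups/ -type f -mtime +30 -delete
```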

1 Comment

+1 for the tip about the $home var, although in my script I declare the path just above the $now var, which I haven't shown here.

You all have already been extra helpful. Thank you.

So the version 1 script that I will try looks like this:

homePath="/home/myuser"
now=$(date +'%m-%d-%y')
mysqldump -h mysql.server.com my_database | gzip -9 > ${homePath}/dbBackups/db_backup.sql.gz
mv ${homePath}/dbBackups/db_backup.sql.gz ${homePath}/dbBackups/${now}_db_backup.sql.gz
find ${homePath}/dbBackups/ -type f -mtime +30 -delete
rsync -e ssh [email protected]:/home/backup_user/backup.server.com/dbBackups/ ${homePath}/dbBackups/

Simple enough. Does that sound right to you?

As version 1 didn't quite work, here, after minimal fiddling, is the working script:

homePath="/home/myuser"
now=$(date +'%m-%d-%y')
mysqldump -h mysql.server.com my_database | gzip -9 > ${homePath}/dbBackups/db_backup.sql.gz
mv ${homePath}/dbBackups/db_backup.sql.gz ${homePath}/dbBackups/${now}_db_backup.sql.gz
find ${homePath}/dbBackups/ -type f -mtime +30 -delete
rsync -a --log-file=${homePath}/rsync.log ${homePath}/dbBackups/ [email protected]:/home/backup_user/backup.server.com/dbBackups/

2 Comments

The script didn't work; it was "skipping directory .", probably because I had inverted the source and target in my rsync command. Also, it looks like we can skip the -e ssh part, since ssh is the default transport rsync uses for remote addresses.
I'd recommend setting now to date +%F (equivalent to %Y-%m-%d) so that your backups order sensibly without using ls -t.
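The sortable-date suggestion in the last comment can be checked quickly; the filename below just follows the question's naming scheme:

```shell
#!/bin/sh
# %F expands to YYYY-MM-DD, which sorts lexicographically in
# chronological order (unlike %m-%d-%y, which mixes up years).
now=$(date +%F)
echo "${now}_db_backup.sql.gz"
```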
