
I tried many database backup scripts but couldn't get one working. I want to back up my database every hour.
I added files to the /etc/cron.hourly/ folder and chmod'ed them to 755, but they didn't run. Below is my pseudocode.

I would be happy if you could write a script for this operation and tell me what else I need to do after adding the script file to the /etc/cron.hourly/ folder.

  • Get the current date and store it in a variable: date=date(d_m_y_H_M_S)
  • Create a variable for the file name: filename="$date".gz
  • Dump the database like this: mysqldump --user=my_user --password=my_pass --default-character-set=utf8 my_database | gzip > "/var/www/vhosts/system/example.com/httpdocs/backups/$filename"
  • Delete all files older than 8 days in the folder /var/www/vhosts/system/example.com/httpdocs/backups/
  • Append this text to the file "/var/www/vhosts/system/example.com/httpdocs/backup_log.txt": Backup is created at $date
  • Change the file owner (chown) from root to "my_user", because I want to open the backup and log files from the "my_user" FTP account.
  • I don't want an email after each cron run, so >/dev/null 2>&1 will be added.
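The steps above can be sketched directly in shell. This is a hedged sketch, not a tested cron job: my_user, my_pass, my_database and the vhosts path are the question's placeholders, and the mysqldump line needs a live server (if the command is missing, the pipeline still produces an empty .gz and the script carries on):

```shell
#!/bin/sh
# Sketch of the pseudocode above; credentials and paths are the
# question's placeholders.
backupdir="${backupdir:-/tmp/backups}"   # on the server: /var/www/vhosts/system/example.com/httpdocs/backups
mkdir -p "$backupdir"

date="$(date +'%d_%m_%Y_%H_%M_%S')"
filename="$date.gz"

# Dump and compress the database (needs a real MySQL server).
mysqldump --user=my_user --password=my_pass \
    --default-character-set=utf8 my_database 2>/dev/null | gzip > "$backupdir/$filename"

# Delete backups older than 8 days; the quoted pattern stops the
# shell from expanding the glob before find sees it.
find "$backupdir" -name '*.gz' -mtime +8 -exec rm -f {} \;

# Append to the log.
echo "Backup is created at $date" >> "$backupdir/backup_log.txt"

# Hand ownership to the FTP account (uncomment on the server, as root):
# chown my_user "$backupdir/$filename" "$backupdir/backup_log.txt"
```

One likely cause of the "it didn't run" symptom: on Debian-based systems, run-parts skips files in /etc/cron.hourly/ whose names contain a dot, so a script named backup.sh there silently never runs.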

11 Answers


After hours and hours of work, I created the solution below. I am copy-pasting it here so other people can benefit.

First, create a script file and give it executable permission:

# cd /etc/cron.daily/
# touch /etc/cron.daily/dbbackup-daily.sh
# chmod 755 /etc/cron.daily/dbbackup-daily.sh
# vi /etc/cron.daily/dbbackup-daily.sh

Then copy the following lines into the file with Shift+Ins:

#!/bin/sh
now="$(date +'%d_%m_%Y_%H_%M_%S')"
filename="db_backup_$now.gz"
backupfolder="/var/www/vhosts/example.com/httpdocs/backups"
fullpathbackupfile="$backupfolder/$filename"
logfile="$backupfolder/backup_log_$(date +'%Y_%m').txt"

echo "mysqldump started at $(date +'%d-%m-%Y %H:%M:%S')" >> "$logfile"
mysqldump --user=mydbuser --password=mypass --default-character-set=utf8 mydatabase | gzip > "$fullpathbackupfile"
echo "mysqldump finished at $(date +'%d-%m-%Y %H:%M:%S')" >> "$logfile"

chown myuser "$fullpathbackupfile"
chown myuser "$logfile"
echo "file permission changed" >> "$logfile"

find "$backupfolder" -name 'db_backup_*' -mtime +8 -exec rm {} \;
echo "old files deleted" >> "$logfile"

echo "operation finished at $(date +'%d-%m-%Y %H:%M:%S')" >> "$logfile"
echo "*****************" >> "$logfile"
exit 0

Edit:
If you use InnoDB and the backup takes too long, you can add the --single-transaction argument to prevent locking. The mysqldump line then becomes:

mysqldump --user=mydbuser --password=mypass --default-character-set=utf8 --single-transaction mydatabase | gzip > "$fullpathbackupfile" 

9 Comments

Nice - but for newbies, you either need to explicitly create the logfile (because you can't append to a file that doesn't exist) or check for it and create it
What's the rotation like on this, how many daily backups does it keep?
Security warning: Using the --password argument makes your password discoverable by looking up the process command line in /proc. Do not use on a multi-user system unless you've applied the various kernel security patches that prevent users from looking at other users' process information.
stackoverflow.com/questions/20751352/… you can use "mysql --login-path=local -e"
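Expanding on the security comment above: besides --login-path, the MySQL clients also read credentials from an option file passed via --defaults-extra-file, which keeps the password out of the process list. A minimal sketch; the file path and credentials are illustrative, and on a real server the file should live somewhere root-only:

```shell
#!/bin/sh
# Write a client option file holding the credentials, readable only
# by its owner, and point mysqldump at it instead of using --password.
conf="${conf:-$(mktemp)}"            # e.g. /etc/mysql/backup.cnf in production
cat > "$conf" <<'EOF'
[mysqldump]
user=mydbuser
password=mypass
EOF
chmod 600 "$conf"

# --defaults-extra-file must be the first option on the command line:
# mysqldump --defaults-extra-file="$conf" --default-character-set=utf8 mydatabase
```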
find . -name sql_backup_* -mtime +8 has to be changed to 'db_backup_*' to include the quotes or the find command fails

Create a script similar to this:

#!/bin/sh -e
location=~/`date +%Y%m%d_%H%M%S`.db
mysqldump -u root --password=<your password> database_name > $location
gzip $location

Then you can edit the crontab of the user that the script is going to run as:

$> crontab -e 

And append the entry

01 * * * * ~/script_path.sh 

This will make it run at the first minute of every hour, every day.

Then you just have to add your log rotation and other functionality and you are good to go.

2 Comments

How about piping the output to gzip to save disk space?
Was having trouble getting this script to run as a cron job on my mac. Added the full paths to the binary files of mysqldump and gzip. This worked great. Without it I got a file ($location) that was 0 in size. gzip never ran. Since I was using gzip as an added feature, my real issue was crontab not recognizing mysqldump path (/usr/local/bin). Hope this helps anyone with this issue. Didn't exactly find it written up like I did here.

I got the same issue, but I managed to write a script. Hope this helps.

#!/bin/bash

# Database credentials
user="username"
password="password"
host="localhost"
db_name="dbname"

# Other options
backup_path="/DB/DB_Backup"
date=$(date +"%d-%b-%Y")

# Set default file permissions
umask 177

# Dump database into SQL file
mysqldump --user=$user --password=$password --host=$host $db_name > $backup_path/$db_name-$date.sql

# Delete files older than 30 days
find $backup_path/* -mtime +30 -exec rm {} \;

# DB backup log
echo -e "$(date +'%d-%b-%y %r '): ALERT: Database has been backed up" >> /var/log/DB_Backup.log

3 Comments

I got the below error when trying to connect: mysqldump: Got error: 1045: Access denied for user 'root'@'localhost' (using password: YES)
Your username could be incorrect. Can you try the username and password directly and see?
I fixed it, thanks. The issue was that the password contained a special character.
#!/bin/sh
# Procedure: DB backup
# Scheduled at: every day 22:00

v_path=/etc/database_jobs/db_backup
logfile_path=/etc/database_jobs
v_file_name=DB_Production
v_cnt=0
MAILTO="[email protected]"
touch "$logfile_path/db_log.log"

# DB backup
mysqldump -uusername -ppassword -h111.111.111.111 dbname > $v_path/$v_file_name`date +%Y-%m-%d`.sql
if [ "$?" -eq 0 ]
then
  v_cnt=`expr $v_cnt + 1`
  mail -s "DB Backup has been done successfully" $MAILTO < $logfile_path/db_log.log
else
  mail -s "Alert : DB Backup has failed" $MAILTO < $logfile_path/db_log.log
  exit
fi

Comments


Here is my MySQL backup script for Ubuntu, in case it helps someone.

# MySQL backup script
start_time="$(date -u +%s)"

now(){
  date +%d-%B-%Y_%H-%M-%S
}

ip(){
  /sbin/ifconfig eth0 2>/dev/null | awk '/inet addr:/ {print $2}' | sed 's/addr://'
}

filename="`now`".zip
backupfolder=/path/to/any/folder
fullpathbackupfile=$backupfolder/$filename
db_user=xxx
db_password=xxx
db_name=xxx

printf "\n\n"
printf "******************************\n"
printf "Started Automatic Mysql Backup\n"
printf "******************************\n"
printf "TIME: `now`\n"
printf "IP_ADDRESS: `ip` \n"
printf "DB_SERVER_NAME: DB-SERVER-1\n"
printf "BACKUP_FILE_PATH $fullpathbackupfile\n"
printf "Starting Mysql Dump \n"

mysqldump -u $db_user -p$db_password $db_name | pv | zip > $fullpathbackupfile

end_time="$(date -u +%s)"
elapsed=$(($end_time-$start_time))
printf "Mysql Dump Completed In $elapsed seconds\n"
printf "******************************\n"

PS: Remember to install pv and zip on your Ubuntu machine:

sudo apt install pv
sudo apt install zip

Here is how I set the crontab (using crontab -e on Ubuntu) to run it every 6 hours:

0 */6 * * * sh /path/to/shfile/backup-mysql.sh >> /path/to/logs/backup-mysql.log 2>&1 

The cool thing is that it creates a zip file, which is easy to unzip from anywhere.

Comments


Copy the following content into a script file (for example /backup/mysql-backup.sh) and save it on your Linux system.

#!/bin/bash

export PATH=/bin:/usr/bin:/usr/local/bin
TODAY=`date +"%d%b%Y"`

DB_BACKUP_PATH='/backup/dbbackup'
MYSQL_HOST='localhost'
MYSQL_PORT='3306'
MYSQL_USER='root'
MYSQL_PASSWORD='mysecret'
DATABASE_NAME='mydb'
BACKUP_RETAIN_DAYS=30

mkdir -p ${DB_BACKUP_PATH}/${TODAY}
echo "Backup started for database - ${DATABASE_NAME}"

mysqldump -h ${MYSQL_HOST} \
   -P ${MYSQL_PORT} \
   -u ${MYSQL_USER} \
   -p${MYSQL_PASSWORD} \
   ${DATABASE_NAME} | gzip > ${DB_BACKUP_PATH}/${TODAY}/${DATABASE_NAME}-${TODAY}.sql.gz

if [ $? -eq 0 ]; then
  echo "Database backup successfully completed"
else
  echo "Error found during backup"
  exit 1
fi

##### Remove backups older than {BACKUP_RETAIN_DAYS} days #####

DBDELDATE=`date +"%d%b%Y" --date="${BACKUP_RETAIN_DAYS} days ago"`

if [ ! -z ${DB_BACKUP_PATH} ]; then
  cd ${DB_BACKUP_PATH}
  if [ ! -z ${DBDELDATE} ] && [ -d ${DBDELDATE} ]; then
    rm -rf ${DBDELDATE}
  fi
fi

After creating or downloading the script, make sure to set execute permission so it runs properly:

$ chmod +x /backup/mysql-backup.sh 

Edit the crontab on your system with the crontab -e command. Add the following entry to run the backup at 3 a.m.:

0 3 * * * /backup/mysql-backup.sh

Comments


Add the following code to your shell script file. Replace dbname, dbuser and dbpass with your database name, username and password respectively.

#!/bin/sh
echo "starting db backup"
day="$(date +"%m-%d-%y")"
db_backup="mydb_${day}.sql"
sudo mysqldump -udbuser -pdbpass --no-tablespaces dbname > /home/${db_backup}
echo "backup complete"

If you want to compress the backup data, just replace those lines with the following code.

db_backup="mydb_${day}.gz"
sudo mysqldump -udbuser -pdbpass --no-tablespaces dbname | gzip -c > /home/${db_backup}

If you want to delete files older than 14 days in some folders, use the following code.

#!/bin/bash
fpath1=/home/ubuntu/mysql/*
fpath2=/home/ubuntu/postgsql/*
file_path=($fpath1 $fpath2)
for i in ${file_path[@]}; do
  find $i -type d -mtime +13 -exec rm -Rf {} +
done
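The -mtime arithmetic above is easy to misread (+13 matches entries at least 14 days old), so here is a throwaway check using back-dated files; touch -d is GNU coreutils, and the demo uses plain files even though the script above applies the same predicate to directories:

```shell
#!/bin/sh
# Demonstrate the retention predicate on back-dated files.
dir="$(mktemp -d)"
touch -d '20 days ago' "$dir/old_backup.sql"
touch "$dir/fresh_backup.sql"

# -mtime +13 matches files whose age is 14 days or more.
find "$dir" -type f -mtime +13 -exec rm -f {} +

ls "$dir"    # only fresh_backup.sql survives
```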

Comments

#!/bin/bash
# Add your backup dir location, password, mysql location and mysqldump location
DATE=$(date +%d-%m-%Y)
BACKUP_DIR="/var/www/back"
MYSQL_USER="root"
MYSQL_PASSWORD=""
MYSQL='/usr/bin/mysql'
MYSQLDUMP='/usr/bin/mysqldump'
DB='demo'

# Empty the backup directory, deleting all previous backups
rm -r $BACKUP_DIR/*

$MYSQLDUMP -u $MYSQL_USER -p"$MYSQL_PASSWORD" $DB | gzip -9 > $BACKUP_DIR/$DB.sql.$DATE.gz

# Change permissions of the backup directory
chmod -R 777 $BACKUP_DIR

Comments

0

You might consider this open-source tool, matiri (https://github.com/AAFC-MBB/matiri), a concurrent MySQL backup script that stores metadata in SQLite3. Features:

  • Multi-Server: Multiple MySQL servers are supported whether they are co-located on the same or separate physical servers.
  • Parallel: Each database on the server to be backed up is done separately, in parallel (concurrency settable: default: 3)
  • Compressed: Each database backup compressed
  • Checksummed: SHA256 of each compressed backup file stored and the archive of all files
  • Archived: All database backups tar'ed together into single file
  • Recorded: Backup information stored in Sqlite3 database
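The checksum step matiri describes can be reproduced by hand with sha256sum from GNU coreutils, which is worth doing before trusting any backup for a restore; the file here is a stand-in for a real dump:

```shell
#!/bin/sh
# Record a SHA-256 checksum next to a backup, then verify it the way
# you would before restoring.
dir="$(mktemp -d)"
printf 'stand-in for a gzipped dump\n' > "$dir/db_backup.sql.gz"

( cd "$dir" && sha256sum db_backup.sql.gz > db_backup.sql.gz.sha256 )

# Verification; prints "db_backup.sql.gz: OK" and exits 0 on success.
( cd "$dir" && sha256sum -c db_backup.sql.gz.sha256 )
```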

Full disclosure: I am the original matiri author.

Comments


As a DBA, you must schedule MySQL database backups so that, in case of any issue, you can recover your databases from a current backup.

Here we are using mysqldump to take the backup of the MySQL databases, and you can put the same into a script.

[orahow@oradbdb DB_Backup]$ cat .backup_script.sh

#!/bin/bash

# Database credentials
user="root"
password="1Loginxx"
db_name="orahowdb"
v_cnt=0
logfile_path=/DB_Backup
touch "$logfile_path/orahowdb_backup.log"

# Other options
backup_path="/DB_Backup"
date=$(date +"%d-%b-%Y-%H-%M-%p")

# Set default file permissions

Continue reading: MySQL Backup

Comments


I have prepared a shell script to create a backup of a MySQL database. You can use it to keep backups of your database(s).

#!/bin/bash

export PATH=/bin:/usr/bin:/usr/local/bin
TODAY=`date +"%d%b%Y_%I:%M:%S%p"`

################################################################
################## Update below values #########################
DB_BACKUP_PATH='/backup/dbbackup'
MYSQL_HOST='localhost'
MYSQL_PORT='3306'
MYSQL_USER='auriga'
MYSQL_PASSWORD='auriga@123'
DATABASE_NAME=( Project_O2 o2 )
BACKUP_RETAIN_DAYS=30   ## Number of days to keep local backup copies; enable the commented code at the end of the script
################################################################

{
  mkdir -p ${DB_BACKUP_PATH}/${TODAY}
  echo " ${TODAY}" >> ${DB_BACKUP_PATH}/Backup-Report.txt
} || {
  echo "Can not make directory"
  echo "Possibly the path is wrong"
}

{
  if ! mysql -u ${MYSQL_USER} -p${MYSQL_PASSWORD} -e 'exit'; then
    echo 'Failed! You may have an incorrect PASSWORD/USER' >> ${DB_BACKUP_PATH}/Backup-Report.txt
    exit 1
  fi

  for DB in "${DATABASE_NAME[@]}"; do
    if ! mysql -u ${MYSQL_USER} -p${MYSQL_PASSWORD} -e "use ${DB}"; then
      echo "Failed! Database ${DB} not found on ${TODAY}" >> ${DB_BACKUP_PATH}/Backup-Report.txt
    else
      # echo "Backup started for database - ${DB}"
      # mysqldump -h localhost -P 3306 -u auriga -pauriga@123 Project_O2   # use gzip..
      mysqldump -h ${MYSQL_HOST} -P ${MYSQL_PORT} -u ${MYSQL_USER} -p${MYSQL_PASSWORD} \
        --databases ${DB} | gzip > ${DB_BACKUP_PATH}/${TODAY}/${DB}-${TODAY}.sql.gz

      if [ $? -eq 0 ]; then
        touch ${DB_BACKUP_PATH}/Backup-Report.txt
        echo "Successfully backed up ${DB} on ${TODAY}" >> ${DB_BACKUP_PATH}/Backup-Report.txt
        # echo "Database backup successfully completed"
      else
        touch ${DB_BACKUP_PATH}/Backup-Report.txt
        echo "Failed to back up ${DB} on ${TODAY}" >> ${DB_BACKUP_PATH}/Backup-Report.txt
        # echo "Error found during backup"
        exit 1
      fi
    fi
  done
} || {
  echo "Failed during backup"
  echo "Failed to backup on ${TODAY}" >> ${DB_BACKUP_PATH}/Backup-Report.txt
  # ./myshellsc.sh 2> ${DB_BACKUP_PATH}/Backup-Report.txt
}

##### Remove backups older than {BACKUP_RETAIN_DAYS} days #####
# DBDELDATE=`date +"%d%b%Y" --date="${BACKUP_RETAIN_DAYS} days ago"`
# if [ ! -z ${DB_BACKUP_PATH} ]; then
#   cd ${DB_BACKUP_PATH}
#   if [ ! -z ${DBDELDATE} ] && [ -d ${DBDELDATE} ]; then
#     rm -rf ${DBDELDATE}
#   fi
# fi
### End of script ####

In the script you just need to supply your username, password, database name (or names, if more than one), and the port number if it is different.

To run the script, use the command:

sudo ./myshellsc.sh

I also suggest that, if you want to record the result (failure or success) in a file, you use the command below:

sudo ./myshellsc.sh 2>> Backup-Report.log 

Thank You.

Comments
