
I'm trying to add a cron job to the crontab (Ubuntu server) that backs up the MySQL database.

Executing the script in the terminal as root works fine, but when I put it in the crontab nothing happens. I've tried running it every minute, but no files appear in the folder /var/db_backups.

(Other cronjobs work well)

Here is the cronjob:

* * * * * mysqldump -u root -pHERE THERE IS MY PASSWORD --all-databases | gzip > /var/db_backups/database_`date +%d%m%y`.sql.gz

What could be the problem?

6 Comments

  • What if you use the full path for mysqldump and gzip? Commented Nov 1, 2013 at 16:39
  • Do you intend this job to run every minute? Commented Nov 1, 2013 at 16:41
  • @fedorqui I've tried using /usr/bin/mysqldump and /usr/local/gnu/gzip, but it's the same. Commented Nov 1, 2013 at 17:02
  • @JaneAvriette Definitely not. I'm just running it every minute for testing purposes. Commented Nov 1, 2013 at 17:03
  • Does your password have any special characters like # or $? Commented Nov 2, 2013 at 18:21

9 Answers


You need to escape the % character with \: in a crontab entry, unescaped % signs are converted into newlines, and everything after the first % is passed to the command as standard input.

mysqldump -u 'username' -p'password' DBNAME > /home/eric/db_backup/liveDB_`date +\%Y\%m\%d_\%H\%M`.sql 
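Applied to the crontab entry from the question, the escaped version would look like this (with -pPASSWORD as a placeholder for the real password):

```shell
* * * * * mysqldump -u root -pPASSWORD --all-databases | gzip > /var/db_backups/database_`date +\%d\%m\%y`.sql.gz
```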

8 Comments

I would like to vote +10! No other source (many Google results, websites, forum posts) has given me this result; this was the solution for me. Without this answer, I would have searched for hours...
One mistake I made was to leave a space between -p and my password; there should be no space, i.e. -pPASSWORD
@Basj: What if I searched for hours until I found this answer? Mind blown
Very useful answer, thanks so much. After escaping the % character with \, it works fine. Thanks @Sandeep
This was my problem as well; if you follow one of the first Google results about "Backing up MySQL on CentOS", they do not include the \

I was trying the same thing, but found that the dump was created at 0 KB. Then I came across this solution, which saved my time.

Command:

0 0 * * * mysqldump -u 'USERNAME' -p'PASSWORD' DATABASE > /root/liveDB_`date +\%Y\%m\%d_\%H\%M\%S`.sql 

Notes

  1. You can change the time setting as per your requirement; the command above runs once a day at midnight.

  2. Make sure you enter your USERNAME, PASSWORD, and DATABASE inside single quotes (').

  3. Add the above command to your crontab.
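For reference, the five schedule fields in the entry above break down like this (a sketch, following the standard crontab field order):

```shell
# "0 0 * * *"  => run daily at 00:00
# ┌────────── minute       (0)
# │ ┌──────── hour         (0)
# │ │ ┌────── day of month (*)
# │ │ │ ┌──── month        (*)
# │ │ │ │ ┌── day of week  (*)
# 0 0 * * *   command-to-run
```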



Check the cron logs (they should be in /var/log/syslog); you can use grep to filter them:

grep CRON /var/log/syslog
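If grep shows the job running but you still get no files, a quick way to capture the actual error is to append the job's stdout and stderr to a log file, e.g. ending the crontab entry with `>> /tmp/backup.log 2>&1` (path is illustrative). The redirection itself works like this:

```shell
# Demonstration of the `>> logfile 2>&1` redirection you would append to the
# cron entry: normal output and error messages both end up in the same file.
log=$(mktemp)
{ echo "dump started"; echo "mysqldump: error" >&2; } >> "$log" 2>&1
cat "$log"
```

Reading the captured stderr usually reveals the root cause directly (wrong path, bad password, permission denied).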

You can also check your local mailbox to see if there are any cron mails:

/var/mail/username

You can also direct cron mail to a different address by setting the MAILTO variable in your crontab file (the address below is an example):

MAILTO=user@example.com

8 Comments

I tried grep CRON and it gave me: CMD (/usr/bin/mysqldump -u root -pPASSWORD! --all-databases | gzip > /var/db_backups/database_... so the command is executed but produces no output files
Did you set up cron for root or for another user? Maybe your cron user has no rights to execute mysqldump
I haven't set up anything yet; I just used crontab -e on a blank Ubuntu server. How can I set that up? Thanks
Try to dump as a regular user and see if it works; check write permissions for /var/db_backups/ and execute permissions for /usr/bin/mysqldump en.wikipedia.org/wiki/Chmod
You're right. As a regular user I get: -bash: /var/db_backups/database_011113.sql.gz: Permission denied
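The Permission denied error in the last comment means the regular user cannot write to /var/db_backups. A minimal sketch of the fix, simulated here in a temporary directory (on the real server you would run chown/chmod on /var/db_backups as root, targeting the user whose crontab runs the job):

```shell
# Create a backup directory and verify the owning user can write to it.
dir="$(mktemp -d)/db_backups"   # stand-in for /var/db_backups
mkdir -p "$dir"
chmod 700 "$dir"                # owner-only access
touch "$dir/test.sql.gz"        # verify the directory is writable
ls "$dir"
```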

Alternatively, you can create a custom command mycommand, to which you can add more options. You must give it execute permissions.

It is preferable to have one folder that stores all your backups; in this case a writable folder "backup", created first in your home directory, for example.

My command in "/usr/local/bin/mycommand":

#!/bin/bash
MY_USER="your_user"
MY_PASSWORD="your_pass"
MY_HOME="your_home"
case $1 in
  "backupall")
    cd $MY_HOME/backup
    mysqldump --opt --password=$MY_PASSWORD --user=$MY_USER --all-databases > bckp_all_$(date +%d%m%y).sql
    tar -zcvf bckp_all_$(date +%d%m%y).tgz bckp_all_$(date +%d%m%y).sql
    rm bckp_all_$(date +%d%m%y).sql;;
  *) echo "Others";;
esac

Cron: Runs the 1st day of each month.

0 0 1 * * /usr/local/bin/mycommand backupall 

I hope it helps somewhat.



Ok, I had a similar problem and was able to get it fixed.

In your case, you could put that mysqldump command in a script, then source the profile of the user who executes mysqldump, e.g.:

. /home/bla/.bash_profile 

Then use the absolute path of the mysqldump command:

/usr/local/mysql/bin/mysqldump -u root -pHERE THERE IS MY PASSWORD --all-databases | gzip > /var/db_backups/database_`date +%d%m%y`.sql.gz 
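If you are unsure where mysqldump lives on your system, `command -v` prints the absolute path to use in the crontab (shown here with `sh` as a stand-in, since the mysqldump path varies by installation):

```shell
# Prints the absolute path of a binary; run `command -v mysqldump` on your server.
command -v sh
```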



Local host MySQL backup:

0 1 * * * /usr/local/mysql/bin/mysqldump -uroot -ppassword --opt database > /path/to/directory/filename.sql

(There is no space between -p and the password, or between -u and the username; replace root with a correct database username.)

This works for me.



Create a new file, and in it write the code that dumps the database to a file location and zips it. Then run that script via cron.



I am using Percona Server (a MySQL fork) on Ubuntu. The package (very likely the regular MySQL package as well) comes with a maintenance account called debian-sys-maint. In order for this account to be used, the credentials are created when installing the package; and they are stored in /etc/mysql/debian.cnf.

And now the surprise: A symlink /root/.my.cnf pointing to /etc/mysql/debian.cnf gets installed as well.

This file is an option file read automatically when using mysql or mysqldump. So basically you then have login credentials given twice: in that file and on the command line. This was the problem I had.

So one solution to avoid this condition is to use the --no-defaults option for mysqldump. The option file then won't be read. However, you then provide credentials via the command line, so anyone who can issue a ps can see the password while the backup runs. So it's best if you create your own option file with user name and password and pass it to mysqldump via --defaults-file.

You can create the option file by using mysql_config_editor or simply in any editor.
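A minimal sketch of such an option file (file name, user, and password are all illustrative): write the credentials to a file readable only by its owner, and point mysqldump at it with --defaults-file, so the password appears neither in the crontab nor in ps output:

```shell
# Create a private option file for mysqldump (illustrative credentials).
cnf="$(mktemp)"                     # on a real server: e.g. /root/backup.cnf
cat > "$cnf" <<'EOF'
[mysqldump]
user=backupuser
password=YOUR_PASSWORD
EOF
chmod 600 "$cnf"                    # keep it readable only by the owner
cat "$cnf"
```

The crontab entry would then look something like `0 1 * * * mysqldump --defaults-file=/root/backup.cnf --all-databases | gzip > /var/db_backups/db.sql.gz` (remember to escape any % in a date format, as noted in the accepted answer).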

Running mysqldump via sudo from the command line as root works just because sudo usually does not change $HOME, so .my.cnf is not found then. When running as a cron job, it is.



You might also need to restart the service to load any of your changes.

service cron restart 

or

/etc/init.d/cron restart 

2 Comments

When you run crontab -e and save, cron installs the new crontab automatically; a restart is not needed
Restarting cron is not required, and is actually misleading for the issue mentioned above
