
I'm trying to back up a folder containing several folders and files to a remote location (I will be uploading zipped files). Are there any existing scripts that could help me, which check whether files have been modified after the date of the last backup, and only back up files created or modified after that date?

The current size of the data is around 1 GB, and I expect it to grow by 50-200 MB each month.

Also, what would be the best way to extract the state of the files as of a specific date?

  • Does it need to be PHP? Do you have access to a Linux or Windows command line? I'm sure there are PHP solutions for this, but there are more numerous, more flexible and more stable ones in other languages (rsync, for example). Commented Jun 10, 2010 at 11:38
  • rsync would definitely work, but I was going to suggest using git, mercurial or the like to get the job done. Commented Jun 10, 2010 at 11:53
  • @Pekka, I'm specifically looking for PHP based solutions, as this needs to run on a shared host. rsync is definitely the best for such things, but I was looking for something that could run seamlessly on shared hosts. Commented Jun 10, 2010 at 12:11
  • @Dogbert Did you find any PHP solution? Commented May 25, 2017 at 8:31

5 Answers


An incremental backup script using PHP:

http://web4u.mirrors.phpclasses.org/package/4841-PHP-Manage-backup-copies-of-files-.html
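For reference, the core idea behind such a script (zip only the files changed since the last run, then record the run time) can be sketched as follows. This is a minimal sketch, not the linked class's actual code; the function name, state file and relative-path handling are assumptions:

```php
<?php
// Hypothetical sketch: archive only files changed since the last run.
// The function name, state file and paths are assumptions.
function backup_changed(string $data_dir, string $state_file, string $zip_path): int
{
    // timestamp of the previous run, or 0 if this is the first backup
    $last = is_file($state_file) ? (int) file_get_contents($state_file) : 0;

    $zip = new ZipArchive();
    $zip->open($zip_path, ZipArchive::CREATE | ZipArchive::OVERWRITE);

    $added = 0;
    $base  = rtrim($data_dir, DIRECTORY_SEPARATOR);
    $it = new RecursiveIteratorIterator(
        new RecursiveDirectoryIterator($base, FilesystemIterator::SKIP_DOTS)
    );
    foreach ($it as $file) {
        if ($file->isFile() && $file->getMTime() > $last) {
            // store paths relative to the data directory
            $zip->addFile($file->getPathname(), substr($file->getPathname(), strlen($base) + 1));
            $added++;
        }
    }
    $zip->close();
    file_put_contents($state_file, (string) time()); // remember this run
    return $added;
}
```

Uploading the resulting zip (e.g. with ftp_put()) and pruning old archives are left out of the sketch.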




I would use Subversion for this. If you have a shell on the remote system, then it's easy to do this with a cron job.

If you are on a shared host, you could automate this process over sftp/ftp by mounting the remote drive (perhaps with FUSE) and then running an svn commit via a cron job.
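Since the question asks for PHP, the commit step could also be triggered from a cron-invoked PHP script that shells out to svn. A rough sketch, where the working-copy path and commit message are my own assumptions:

```php
<?php
// Hypothetical sketch: build the svn command that a cron-triggered
// PHP script could exec(). The working-copy path is an assumption.
function svn_backup_command(string $working_copy): string
{
    $wc = escapeshellarg($working_copy);
    // add any new files, then commit everything in one shot
    return "svn add --force $wc && svn commit -m 'automated backup' $wc";
}

// Example (commented out to avoid side effects):
// exec(svn_backup_command('/mnt/backup-working-copy'), $out, $code);
```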



You can call/execute rsync from PHP. rsync is a command that synchronizes two directories, local or remote, as its name implies. The good thing about rsync is that it only adds new resources, sends only diffs for updated resources, and deletes anything that is not in the source directory (if you want it to). Note that with this you don't get incremental backups. For that you should use a VCS (Git, SVN or CVS), as stated in other answers.

Here is a step-by-step rsync+PHP tutorial for using it from within PHP.
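As a rough illustration (the paths and flags below are assumptions, not taken from the tutorial), calling rsync from PHP boils down to building the command safely and shelling out:

```php
<?php
// Hypothetical sketch: build an rsync command from PHP.
// Source, destination and flags are assumptions for illustration.
function rsync_command(string $src, string $dest): string
{
    // -a: archive mode, -z: compress, --delete: mirror deletions
    return sprintf(
        'rsync -az --delete %s %s',
        escapeshellarg($src),
        escapeshellarg($dest)
    );
}

// Example invocation (commented out so the sketch has no side effects):
// exec(rsync_command('/var/www/data/', 'user@backup.example.com:/backups/'), $out, $code);
```

Note that on typical shared hosts exec() may be disabled, which is worth checking before going this route.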



I don't think anything like this exists.

First, you'll need a recursive function to find all the files in a directory and all its subdirectories. There are many examples of this; the idea is to use the scandir() function recursively.

Then, for each file found, you'll need to check whether the file has been modified since your last backup, and if it has, add it to the list of files to back up. You could do something like:

if (filemtime($filename) > $last_backup_time) {
    $files_to_backup[] = $filename;
}

Finally, for each file to back up, you just have to copy() or ftp_put() your archive of modified files.
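Putting the first two steps together, a minimal sketch might look like this (the function name is hypothetical):

```php
<?php
// Minimal sketch combining the steps above: recursive scandir()
// plus a filemtime() check. The function name is hypothetical.
function files_modified_since(string $dir, int $last_backup_time): array
{
    $files_to_backup = [];
    foreach (scandir($dir) as $entry) {
        if ($entry === '.' || $entry === '..') {
            continue;
        }
        $path = $dir . DIRECTORY_SEPARATOR . $entry;
        if (is_dir($path)) {
            // recurse into subdirectories
            $files_to_backup = array_merge(
                $files_to_backup,
                files_modified_since($path, $last_backup_time)
            );
        } elseif (filemtime($path) > $last_backup_time) {
            $files_to_backup[] = $path;
        }
    }
    return $files_to_backup;
}
```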

2 Comments

The RecursiveDirectoryIterator from the SPL is a much nicer way than recursive scandir calls ;) stackoverflow.com/questions/2418068/…
You can also execute the rdiff command for that. See en.wikipedia.org/wiki/Rsync#Variations

Instead of rsync, you could consider rdiff-backup. Using rsync techniques, it is able to make incremental remote backups. Git can also do this, but the downside is that you can't remove older increments from the repository (due to Git's nature).

I don't see why you'd write your own solution when other excellent solutions already exist.

