
I need to run a check on a folder to see when it was last modified. By this I mean the last time it, or any of the files it contains, was last modified.

I have tried two ways so far:

  1. using the stat() function on the folder, and then grabbing mtime

    $stat = stat("directory/path/"); echo $stat["mtime"]; 
  2. using the filemtime() function on the folder

    echo (filemtime("directory/path/")); 

Both of these methods return the same value, and this value does not change if I update one of the files. I am guessing this is because the folder structure itself does not change, only the content of one of the files.

I guess I could loop through all the files in the directory and check their modification dates, but there are potentially a lot of files and this doesn't seem very efficient.

Can anyone suggest how I might go about getting a last modification time for a folder and its content in an efficient way?

  • Maybe take a look at the SplFileInfo class: php.net/manual/en/class.splfileinfo.php Commented Dec 14, 2013 at 17:42
  • @DirkMcQuickly Using SplFileInfo getMTime() gives me exactly the same result as stat and filemtime Commented Dec 14, 2013 at 17:53
  • does your server run on windows or linux? Commented Dec 14, 2013 at 17:57
  • @DirkMcQuickly It's an Ubuntu Linux server Commented Dec 14, 2013 at 18:01
  • If you need to check on a regular basis, you could keep track of the last file change in a text file (when you change a file, change the text file), and get your value from there. Commented Dec 14, 2013 at 18:10
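The caching idea from the last comment could be sketched like this: maintain a marker file that your code touches whenever it writes to the directory, then read the marker's mtime instead of scanning everything. The marker filename here is just an illustrative choice, not an established convention:

```php
<?php
// Sketch of the "track it yourself" approach suggested in the comments.
// The marker filename is hypothetical; pick anything your app won't clash with.
$marker = 'directory/path/.last_modified';

// Call this from every code path that writes into the directory:
function markDirectoryChanged($marker) {
    touch($marker); // sets the marker's mtime to "now"
}

// Later, the folder's last-modified time is a single, cheap stat:
$lastModified = file_exists($marker) ? filemtime($marker) : null;
```

The trade-off is that it only works if all writes go through your own code; files changed by other processes won't update the marker.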

1 Answer



I suggest that you loop over all the files with foreach; I don't think there's a built-in function for that purpose. Here's a very simple example using that loop:

    $latest = 0;
    foreach (glob('gfd/*') as $file) {
        // keep the newest mtime, not just the last file iterated
        $latest = max($latest, filemtime($file));
    }
    echo "Folder last modified: " . date('d.m.Y H:i:s', $latest) . "<br />";

Keep in mind that foreach is pretty fast, and if you have fewer than about 3000 files, I think there's nothing to worry about. If you don't want to use this, you can always save the modification date to a file or something like that. :)

Subfolder-compatibility:

    function rglob($pattern, $flags = 0) {
        $files = glob($pattern, $flags);
        foreach (glob(dirname($pattern).'/*', GLOB_ONLYDIR|GLOB_NOSORT) as $dir) {
            $files = array_merge($files, rglob($dir.'/'.basename($pattern), $flags));
        }
        return $files;
    }

See this question: php glob - scan in subfolders for a file
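Putting the two pieces together, the recursive glob can feed the same max-of-mtimes loop to cover subfolders as well ('gfd' is just the example path used above):

```php
<?php
// Sketch: newest mtime across a folder and all its subfolders,
// assuming the rglob() helper defined above is available.
$latest = 0;
foreach (rglob('gfd/*') as $file) {
    $latest = max($latest, filemtime($file));
}
echo "Folder last modified: " . date('d.m.Y H:i:s', $latest);
```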


6 Comments

You're not taking sub folders in consideration, though. Make this recursive and I'll +1 you.
Thanks. My main concern is speed and efficiency, as this is likely to get a lot of use. I had not thought of using glob().
@h2ooooooo Here you go. ;)
In that case, you can also use the RecursiveDirectoryIterator. php.net/manual/en/class.recursivedirectoryiterator.php
@Finglish glob() is no faster than any other method of looping through all the files, so this answer adds nothing new to what you stated in your question. foreach is indeed fast, but it is not foreach that is the bottleneck in this scenario.
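For reference, the RecursiveDirectoryIterator approach mentioned in the comments could look roughly like this (directory path is the placeholder from the question):

```php
<?php
// Sketch using SPL iterators instead of glob(); same idea, same cost:
// every file still gets stat()ed once.
$latest = 0;
$iterator = new RecursiveIteratorIterator(
    new RecursiveDirectoryIterator('directory/path/', FilesystemIterator::SKIP_DOTS)
);
foreach ($iterator as $fileInfo) {
    $latest = max($latest, $fileInfo->getMTime());
}
echo date('d.m.Y H:i:s', $latest);
```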
