I need to write a script that will check a directory for a given file type, say *.mov, and move up to, say, 50GB worth into another directory. The reason is that we have a huge pile of files that need to go through another automated system, but sending terabytes through at one time will cause a whole bunch of other processes to break.

Is there a way to restrict find to listing only up to a given total size? If so, I could simply run something like find /path/to -iname "*.mov" [halt at 50GB] -exec mv {} /destination \;
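As far as I can tell, find has no built-in cumulative-size cutoff, but something along these lines might approximate it. This is only a rough sketch, assuming bash (for read -d '') and BSD/macOS stat (the GNU equivalent would be stat -c '%s'), with /destination as a placeholder path:

    LIMIT=50000000000    # 50GB cap, in bytes (decimal)
    TOTAL=0
    find /path/to -iname "*.mov" -print0 |
    while IFS= read -r -d '' f
    do
        SIZE=$(stat -f '%z' "$f")    # this file's size in bytes
        TOTAL=$((TOTAL + SIZE))      # keep a running total
        if [ "$TOTAL" -ge "$LIMIT" ]
        then
            break                    # stop once the cap is reached
        fi
        mv "$f" /destination/
    done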

Or does anyone have any other suggestions as to how I can automate this? I suppose I could write a loop that added the size to a variable and halted if the variable got to a certain size, like (I'll worry about the syntax of the filesize test later):

    TOTALSIZE=0
    for x in /path/to/*.mov
    do
        (print file size > $filesize) && TOTALSIZE=`expr $TOTALSIZE + $filesize`
        if [ $TOTALSIZE > 50000000 ]
        then
            exit 0
        else
            mv $x /destination
        fi
    done

Would that work?

1 Answer

As a matter of fact, I just smashed this out and it seems to work fairly well:

    TOTALSIZE=0   # running total, in bytes
    for x in $ORIGIN/*.mov
    do
        # add this file's size (BSD/macOS stat) to the running total
        FILESIZE=`stat -f '%z' "$x"` && TOTALSIZE=`expr $FILESIZE + $TOTALSIZE`
        if test $TOTALSIZE -ge 2000000
        then
            exit 0
        else
            mv "$x" $DESTINATION
        fi
    done

(I just tested with 2MB rather than 50GB, but I assume it will scale up.)
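For the real 50GB run, the only change should be the comparison value. Assuming decimal gigabytes and a 64-bit system (an older 32-bit expr/test might not cope with integers this large), the test line would become:

    if test $TOTALSIZE -ge 50000000000    # 50GB in bytes; use 53687091200 for 50GiB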
