
I want to watch a directory in Ubuntu 14.04, and when a new file is created in this directory, run a script.

Specifically, I have security cameras that upload captured video via FTP when they detect motion. I want to run a script on this FTP server so that when new files are created, they get mirrored (uploaded) to a cloud storage service immediately, which is done via a script I've already written.

I found iWatch, which lets me do this (http://iwatch.sourceforge.net/index.html). The problem I'm having is that iwatch kicks off the cloud upload script the instant the file is created in the FTP directory, even while the file is still being uploaded. This causes the cloud sync script to upload 0-byte files, which are useless to me.

I could add a 'wait' to the cloud upload script, but that seems hacky, and it's impossible to predict how long to wait since it depends on file size, network conditions, etc.

What's a better way to do this?

3 Answers


Although inotifywait was mentioned in the comments, a complete solution might be useful to others. This seems to be working:

inotifywait -m -e close_write /tmp/upload/ | gawk '{print $1$3; fflush()}' | xargs -L 1 yourCommandHere 

will run

yourCommandHere /tmp/upload/filename 

when a newly uploaded file is closed.

Notes:

  • inotifywait is part of the inotify-tools apt package in Ubuntu. It uses the kernel's inotify service to monitor file or directory events.
  • The -m option is monitor mode, which outputs one line per event to stdout.
  • -e close_write selects close events for files that were open for writing. Waiting for the file to be closed should avoid acting on incomplete files.
  • /tmp/upload can be replaced with whatever directory you want to monitor.
  • The pipe to gawk reformats the inotifywait output lines, dropping the 2nd column (a repeat of the event type) and combining the directory name in column 1 with the filename in column 3 into a single path. Each line is flushed immediately to defeat buffering and encourage immediate action by xargs.
  • xargs takes the list of files and runs the given command once per file, appending the filename to the end of the command. -L 1 makes xargs run after each line received on standard input.
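
For completeness, here's a minimal sketch of what the command on the receiving end might look like. The script name and the rclone destination are only placeholders; substitute whatever upload script you already have:

#!/bin/bash
# upload-to-cloud.sh -- xargs calls this with one full path per invocation.
# The rclone command is purely illustrative; swap in your own sync tool or script.
set -euo pipefail

file=$1
rclone copy "$file" remote:camera-footage/

Wired into the pipeline above, that becomes:

inotifywait -m -e close_write /tmp/upload/ | gawk '{print $1$3; fflush()}' | xargs -L 1 ./upload-to-cloud.sh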

You were close to the solution there. You can watch many different events with iwatch; the one that interests you is close_write. Syntax:

iwatch -e close_write <directory_name> 

Of course, this only works if the file is closed when the writing is complete, which is a sane assumption but not necessarily a true one (though it often is).
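
If I remember the iwatch options correctly, it can also run a command itself via -c, with %f expanded to the full path of the file that triggered the event (check the man page for the exact placeholders). The directory and script name here are placeholders:

iwatch -e close_write -c '/path/to/upload-to-cloud.sh %f' /srv/ftp/cameras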

2 Comments

Thanks, I did try that as well, and also tried inotifywait -e close_write as a 'wait' in the upload script. Unfortunately, neither one works as expected. I guess it's because the files being uploaded to vsftpd are, for some reason, not 'closed' (or don't trigger 'close_write') properly. I might try something hacky like running 'ls -al' on the file and waiting until the file size stops changing, then proceeding (sketched after these comments). It seems like a horrible solution though.
For now, instead of doing what I'd planned and cloud-syncing files as soon as they are uploaded to the FTP server, I'm running an hourly cron job that uploads files to cloud storage if they aren't already there. An acceptable, if sub-optimal, workaround.
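
For reference, the "wait until the size stops changing" idea from the first comment could be sketched like this; the polling interval and upload-script path are arbitrary placeholders, and it remains the fallback hack it sounds like:

#!/bin/bash
# wait-until-stable.sh -- poll a file's size until it stops changing,
# then hand it to the existing upload script. A last-resort fallback.
file=$1
prev=-1
size=$(stat -c %s "$file")
while [ "$size" != "$prev" ]; do
    prev=$size
    sleep 5                          # arbitrary polling interval
    size=$(stat -c %s "$file")
done
/path/to/your-upload-script "$file"  # placeholder for the script you already have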

Here's another variation: it reacts to a filesystem event by making a POST request to a given URL, using inotifywait, jq, and curl.

#!/bin/bash
set -euo pipefail
cd "$(dirname "$0")"

watchRoot=$1
uri=$2

# Read "dir event file" lines from inotifywait and POST a small JSON payload
# for each file that has finished writing.
function post() {
    while read -r path action file; do
        echo '{"Directory": "", "File": ""}' \
            | jq ".Directory |= \"$path\"" \
            | jq ".File |= \"$file\"" \
            | curl --data-binary @- -H 'Content-Type: application/json' -X POST "$uri" \
            || continue
    done
}

inotifywait -r -m -e close_write "$watchRoot" | post
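
Assuming the script is saved as something like watch-and-post.sh (the name, directory, and URL below are made up for illustration), it takes the directory to watch and the endpoint to notify:

./watch-and-post.sh /srv/ftp/cameras http://localhost:8080/uploads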

