Using Mac OS X command line I want to perform a simple find and replace in a large number of files, within the current directory and its many sub-directories.
I need to perform many replacements so I'd like the script to be as performant as possible.
Whatever I try seems to result in some random error so I'm finally asking for help.
So given I have two variables:

    FIND=oldText
    REPLACE=newText

Here's what I've tried so far:
    sed -i '' "s/${FIND}/${REPLACE}/g" *
    > sed: Build: in-place editing only works for regular files

Apparently this is trying to sed the directory paths themselves, so I subsequently tried (to exclude directories from being sed'ed):
    find * -type f -print | xargs sed -i '' "s/${FIND}/${REPLACE}/g"
    > xargs: sed: Argument list too long

So because I have such a large list of files to action, xargs can't handle it. Apparently -exec is better with large lists...
    find * -type f -print -exec sed -i '' "s/${FIND}/${REPLACE}/g" {} \;

Now this does actually work. HOWEVER, sed decides it must correct the missing end-of-file linefeed in every file where it is missing, despite there being no replacements in the file. Unfortunately there are thousands of files of this nature and it's not an option for me to be making changes of this magnitude for this current piece of work. (Please don't preach about how I should correct the files; that's not the question I am asking.)
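(For reference, a sketch of the batching variant of that find: terminating -exec with + instead of \; makes find pass many file names to each sed invocation, much like xargs, instead of forking one sed per file. The demo below runs in a throwaway directory with invented file names, and uses -i.bak, which both BSD and GNU sed accept, so it also runs outside macOS; note this variant still rewrites every file sed visits, so the missing-linefeed problem remains.)

```shell
# Demo in a scratch directory; file names are made up for illustration.
FIND=oldText
REPLACE=newText
tmp=$(mktemp -d)
printf 'aaa oldText bbb\n' > "$tmp/a.txt"
printf 'no match here\n'   > "$tmp/b.txt"

# `{} +` ends -exec with as many file names per sed invocation as will
# fit on a command line, so find batches arguments like xargs does.
# -i.bak keeps a backup copy and is accepted by BSD and GNU sed alike.
find "$tmp" -type f -exec sed -i.bak "s/${FIND}/${REPLACE}/g" {} +

cat "$tmp/a.txt"   # aaa newText bbb
```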
So in an attempt to overcome this issue I have tried to first extract the list of files that do indeed contain my ${FIND} term and then only perform the sed on those files...
    grep -r -l -e "${FIND}" "." | sed -i '' "s/${FIND}/${REPLACE}/g"
    > sed: -i may not be used with stdin
    grep -r -l -e "${FIND}" "." | -exec sed -i '' "s/${FIND}/${REPLACE}/g" {} \;
    > ./file1.txt: line 10: -exec: command not found
    $( grep -r -l -e "${FIND}" "." ) -exec sed -i '' "s/${FIND}/${REPLACE}/g" {} \;
    > ./file1.txt: line 10: -exec: command not found
    FILEPATHS_CONTAINING_FIND=$( grep -r -l -e "${FIND}" "." )
    sed -i '' "s/${FIND}/${REPLACE}/g" "${FILEPATHS_CONTAINING_FIND}"
    > sed: ./File1.txt ./File2.txt ./File3.txt: No such file or directory

I think here it's treating the variable ${FILEPATHS_CONTAINING_FIND} as a single long file path. If I remove the double quotes it doesn't handle paths with spaces, so that's not an option either. I went back to trying xargs, now that the list of files is shorter having been filtered...
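(One sketch of a way around that single-string problem, assuming file names contain no embedded newlines: pipe grep's list into a while read loop, so each line stays one quoted argument. File names below are invented for the demo, and -i.bak is used instead of the macOS-only -i '' so the snippet also runs under GNU sed.)

```shell
FIND=oldText
REPLACE=newText
tmp=$(mktemp -d)
printf 'oldText\n' > "$tmp/file with space.txt"

# Read one file name per line; IFS= and -r preserve leading whitespace
# and backslashes. Quoting "$f" keeps a name with spaces as one argument.
grep -r -l -e "${FIND}" "$tmp" | while IFS= read -r f; do
    sed -i.bak "s/${FIND}/${REPLACE}/g" "$f"
done

cat "$tmp/file with space.txt"   # newText
```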
    $( grep -r -l -e "${FIND}" "." ) | xargs sed -i '' "s/${FIND}/${REPLACE}/g"
    > ./Script.sh: line 10: ./File1.txt: Permission denied

Trying sudo in various places makes no difference.
Anyway, I've resorted to using this for loop, but I'd really rather have something more succinct and performant.
    IFS=$'\n' # Ensure spaces don't mess up the for loop
    for FILEPATH_CONTAINING_FIND in $(grep -r -l -e "${FIND}" "."); do
        sed -i '' "s/${FIND}/${REPLACE}/g" "${FILEPATH_CONTAINING_FIND}"
    done

Can anyone help me with the problems I've experienced above?
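(For comparison, a more succinct equivalent of the loop above, assuming your grep and xargs support NUL-separated output, which macOS and GNU tools both do: grep --null prints matching file names NUL-terminated and xargs -0 splits on NUL, so spaces, and even newlines, in names are safe, and sed only ever sees files that actually contain the match. The demo uses -i.bak instead of the macOS-only -i '' so it also runs under GNU sed.)

```shell
FIND=oldText
REPLACE=newText
tmp=$(mktemp -d)
printf 'keep oldText\n'              > "$tmp/file with space.txt"
printf 'no match, no final newline'  > "$tmp/untouched.txt"

# --null / -0: NUL-terminated file names survive any character in a path.
# Only matching files reach sed, so untouched.txt keeps its missing newline.
grep -r -l --null -e "${FIND}" "$tmp" |
    xargs -0 sed -i.bak "s/${FIND}/${REPLACE}/g"

cat "$tmp/file with space.txt"   # keep newText
```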
Why are you using -i '' to overwrite files while you still have not got your script working? Overwriting files without a backup should only be done when you're pretty confident the whole thing will work. Well, never mind; they're your files and you can do as you please. But sanity dictates that you don't (normally) go around overwriting files until you're sure you're going to do it right. With -exec, use {} + instead of {} \; because it makes find behave like xargs and run with multiple file names as arguments. The error message xargs: sed: Argument list too long is pretty weird. Exactly how long are your ${FIND} and ${REPLACE} strings? grep -r -l … | xargs sed … should deal with the files containing the match. If you have spaces or other characters outside the portable file name character set in the names, say so; it makes your job harder. If you have file names containing newlines, it makes life even harder; that would be important information to have.