For every line of one file, I need to check whether that string (which may contain regular expressions) is found in another file.
The problem is that the files are big: the first is 24 MB and the second 115 MB. I first tried $(cat file1) as the pattern argument to grep, but it complains that the argument list is too long, and xargs grep, which I'm trying now, fails with the same error.
If I do a simple string search, it works:

find . -name records.txt | xargs grep "999987^00086"
999987^00086^14743^00061^4

but if I try to pass the whole file as the pattern with cat, it fails:
find . -name records.txt | xargs grep "$(records_tofix.txt)" -bash: /usr/bin/xargs Argument list too long on grep
bash: records_tofix.txt: command not found

If what you actually want is set arithmetic on whole lines rather than pattern matching, use comm instead: it is the canonical UNIX tool for set arithmetic (unions, joins, and differences) on sorted input streams.

xargs should only be used with -0 or -d $'\n' arguments (the latter is a GNUism, but a necessary GNUism if you want files with one line per record to be unambiguously and correctly parsed). Otherwise foo bar on one line will be treated as two separate records, foo and bar; backslashes, quotes, &c. also get special (shell-like but not-quite-shell-compatible) treatment.

In any case, find . -name records.txt -exec grep -f records_tofix.txt -- {} + is your friend; no reason to use xargs at all.
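A rough sketch of those suggestions, assuming the records in records_tofix.txt are fixed strings (the embedded ^ looks like a field separator); drop -F if they really are regular expressions, as the question suggests:

# Let grep read the patterns from the file, one per line, so they never
# touch the command line at all.
find . -name records.txt -exec grep -F -f records_tofix.txt -- {} +

# If xargs is used anyway, feed it NUL-delimited file names so that paths
# containing spaces are not split into separate arguments.
find . -name records.txt -print0 | xargs -0 grep -F -f records_tofix.txt --

# For whole-line set arithmetic rather than substring matching: print the
# lines that appear only in records_tofix.txt.
comm -23 <(sort records_tofix.txt) <(sort records.txt)

The pattern file read with -f never passes through the shell's argument list, so its size is irrelevant to the "Argument list too long" limit.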