
I want to check whether a binary file matches a binary pattern.

For that, I'm using the ClamAV signature database:

Exploit.HTML.ObjectType:3:*:3c6f626a65637420747970653d222f2f2f2f2f2f2f2f2f2f2f2f 

I use this to retrieve the hex signature string:

signature=$(echo "$line" |awk -F':' '{ print $4 }') 

Then I would like to convert the hex string to binary:

tmp=$(echo -n $signature | sed 's/\([0-9A-F]\{2\}\)/\\\\\\x\1/gI' | xargs print) 

Finally I would like to check whether my file ( *$raw_file_path* ) matches my binary pattern (now in $tmp):

test_var=$(cat $raw_file_path | grep -U -P "$tmp") 

I don't know why it doesn't work.

If you have any ideas, thanks.

  • grep and Unix tools in general are not designed to read binary files, which contain \000 (null) characters as part of their normal data. Unix tools rely on reading lines of data separated by end-of-line characters (\n, or \r\n on Windows filesystems). Null characters break this basic processing model. You may find a combination of options to od that will allow you to do what you want. Good luck. Commented Nov 11, 2012 at 3:55
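The od route the comment hints at can be sketched like this (the file name and its contents are placeholders for illustration): dump the file as one continuous hex string, then search that text with ordinary tools, null bytes and all.

```shell
# Hedged sketch: dump a binary file as one hex string, then search it as text.
# sample.bin is a placeholder file created here for illustration.
printf '\x00\x3c\x6f\x62\x6a\x00' > sample.bin

# -A n suppresses offsets, -t x1 prints one hex byte per column;
# tr strips spaces and newlines, leaving a single searchable string.
hexdump=$(od -A n -t x1 sample.bin | tr -d ' \n')
echo "$hexdump"   # 003c6f626a00

# Now an ordinary grep works even though the file contains null bytes:
grep -q '3c6f626a' <<< "$hexdump" && echo "signature found"
```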

1 Answer


How about this?

line=Exploit.HTML.ObjectType:3:*:3c6f626a65637420747970653d222f2f2f2f2f2f2f2f2f2f2f2f
printf $(sed 's/.*://;s/\(..\)/\\x\1/g' <<< "$line")

Which yields:

<object type="//////////// 

You can put the binary output in a variable thus:

printf -v variable $(sed 's/.*://;s/\(..\)/\\x\1/g' <<< "$line") 
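To double-check that the variable really holds the decoded bytes, you can dump it with od (a quick sanity check, not part of the original answer; the sample signature decodes to printable ASCII, so the result is easy to read):

```shell
line='Exploit.HTML.ObjectType:3:*:3c6f626a65637420747970653d222f2f2f2f2f2f2f2f2f2f2f2f'

# Decode the hex field after the last colon into raw bytes.
printf -v variable "$(sed 's/.*://;s/\(..\)/\\x\1/g' <<< "$line")"

# Dump the decoded bytes; -c shows printable characters directly.
printf '%s' "$variable" | od -c
```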

Now, please avoid a useless use of cat!

grep -U "$variable" "$raw_file_path" 

is enough. If you want to test the result of grep (and ask grep to be quiet):

if grep -qU "$variable" "$raw_file_path"; then
    echo "Pattern found"
else
    echo "Pattern not found"
fi
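Putting it all together, here is a hedged sketch of the whole loop as a function (the function name and the name:type:offset:hex field layout are assumptions based on the .ndb line shown in the question; note also that bash variables cannot hold NUL bytes, so signatures containing 00 would need a different approach, such as the od trick from the comments):

```shell
# Sketch: scan one file against every signature in an .ndb-style database.
# scan_file is a hypothetical helper name, not from the original answer.
scan_file() {
    local sig_db=$1 raw_file_path=$2 name type offset hex pattern
    while IFS=':' read -r name type offset hex; do
        [ -n "$hex" ] || continue
        # Decode the hex signature into raw bytes.
        printf -v pattern "$(sed 's/\(..\)/\\x\1/g' <<< "$hex")"
        # -q quiet, -U binary-safe, -F literal match (the decoded bytes
        # may contain regex metacharacters).
        if grep -qUF -- "$pattern" "$raw_file_path"; then
            echo "Match: $name"
        fi
    done < "$sig_db"
}
```

Called as scan_file signatures.ndb suspect.bin, it prints one Match: line per signature found.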

8 Comments

Nice usage of printf and sed: +1! (Remark: instead of s/.*:// ... $line you could use bash parameter expansion and write: $(sed 's/\(..\)/\\x\1/g;' <<<${line##*:}).) But thanks, I hadn't read about the -v option of printf until today! (+2 as this was useful for me ;-) Well, I'll go to bash to see how to use arrays or even associative arrays with this printf -v. And welcome around here!
@F.Hauri You're right about the parameter expansion. Though I have this habit that when I start forking a sed, I try to get as much of it as I can (when reasonably possible). Besides, my benchmarks prove that in this case, the 100% sed method is almost 1% faster (!) than the sed-parameter expansion method :-). About printf and arrays, there are no options that I know to tell printf to construct an array (as there is, e.g., with the -a option of the read builtin).
Thanks! I will use: signature=$(echo "$line" | awk -F':' '{ print $4 }'); printf -v variable "$(sed 's/\(..\)/\\x\1/g' <<< "$signature")"; grep -U "$variable" "$raw_file_path"
Up to now, I use sed to build a bash array or even associative array, then fork to/from eval: ` declare -A fields;eval "fields=($(df -PlkT / | sed -ne '/\/$/{s/^[^ \t]*[ \t]\+\([^ \t]\+\)[ \t]\+\([^ \t]\+\)[ \t]\+\([^ \t]\+\)[ \t]\+\([^ \t]\+\)[ \t]\+.*/[fs]="\1" [tot]="\2" [use]="\3" [ava]="\4"/p}')")"`
@F.Hauri How about: f=( _ fs tot use ava _ ); { read; read _ ${f[@]} _; } < <(df -PlkT /); declare -A fields; for i in ${f[@]}; do [[ $i != _ ]] && fields[$i]=${!i}; done ? (don't know if it's portable though). But eval is evil! :-)
