It's not that there are too many files, but that the list of arguments passed to the grep command is too long. It's a limitation of the execve(2) system call on the combined size of the argument list and the environment variables passed along with that call.
On Linux, since 2.6.23, it's an administrative limit tied to the stack size limit (a quarter of it), so it can be raised or lifted with ulimit -s (which also sets the limit on the process stack size). So
ulimit -s unlimited
may work for you.
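To check how much room you have (the exact figure also depends on the size of your environment), assuming a POSIX getconf is available:
getconf ARG_MAX   # combined space allowed for arguments + environment
ulimit -s         # current stack size limit (in KiB)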
Otherwise, the workarounds, most of which are already mentioned in other answers, consist in splitting the list of arguments so each chunk fits under that limit, or in avoiding passing the list of files to execve() altogether.
ls | xargs grep polyhedron
(OK here only because the file names contain nothing but digits; with blanks, quotes or backslashes in them, xargs' default input parsing would mangle the names)
(it's xargs' responsibility to split the list and run as many grep invocations as necessary so the execve() limit is never exceeded).
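If the file names could contain blanks, quotes or newlines, a more robust variant, assuming your xargs supports -0 (GNU and BSD ones do), is to let the shell expand the glob (printf is a builtin, so not subject to the execve() limit) and pass the names NUL-delimited:
printf '%s\0' * | xargs -0 grep polyhedron /dev/null
The extra /dev/null argument makes sure grep still prints file names if a batch happens to contain a single file.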
find . -exec grep polyhedron {} +
Same, but this time, find does the splitting.
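If you only want regular files in the current directory and not in subdirectories, a variant assuming your find supports -maxdepth (GNU and BSD, not POSIX):
find . -maxdepth 1 -type f -exec grep polyhedron /dev/null {} +
(the /dev/null serves the same purpose as above).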
grep -r polyhedron .
(if your grep supports -r). This time only three short arguments are passed to grep; grep builds the list of files internally and never passes it to an execve() system call.
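With GNU grep (and recent BSD greps), you can also restrict a recursive search to certain file names; --include is not POSIX, so take it as an illustration:
grep -r --include='*.txt' polyhedron .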
Some shells have builtin ways around the limit.
With shells where grep is a builtin, you wouldn't have the issue in the first place, since builtins are not executed via an execve() system call.
With ksh93, you can use:
command -x grep polyhedron *
And ksh93 will do the splitting.
zsh has the zargs function:
zargs * -- grep polyhedron
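zargs is shipped as an autoloadable function (see zshcontrib(1)), so you may need to load it first:
autoload -Uz zargs
zargs * -- grep polyhedron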
To search for more than one word, you can do:
grep -e word1 -e word2 ...
(which matches lines containing any of the words). Or, to search for them as a single phrase:
grep 'word1 word2 ...' ...
Or put the list of words in a file, one per line, and use
grep -f that-file ...
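For instance, a minimal sketch combining that with one of the workarounds above (the words.txt name is just an illustration):
printf '%s\n' word1 word2 > words.txt   # one pattern per line
find . -exec grep -f words.txt {} +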