Speeding up my script



I have a web page calling a Perl script that searches for patterns in
20,000+ files and returns links to the files and the matching lines. I
use a call to `find` and `egrep`.
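
(Roughly this kind of thing - simplified, and the real options get
built up by a bunch of if statements; $pattern here just stands in
for whatever the web form sent:)

    # a sketch of the current search, not the exact command:
    # egrep -n prints file:line:match; /dev/null forces the
    # filename prefix even when xargs passes only a single file
    my @hits = `find . -type f -print0 | xargs -0 egrep -n '$pattern' /dev/null`;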

Q: The script works, but it's straining under the load - the files run
into the GBs. How can I speed up the process? How simple would it be to
employ threads or to split off new processes?

I know I should RTFM (LOL) and I will, but I'm just looking for some
quick guidance/suggestions.

Pseudocode:

    chdir $doc_root;                 # cd to root of document directory

    my @dirnames = ( ... );          # load array with names of directories

    my @big_array;
    foreach my $subdir (@dirnames) {
        chdir $subdir;
        # lots of if statements to figure out which find command
        # and which options to use
        my @temp_array = `$long_find_grep_command`;
        push @big_array, @temp_array;
        # other processing
        chdir '..';                  # back to the document root
    }

What I'd like is to be able to search more than one subdirectory
simultaneously - something like the fork() sketch below.
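
(Untested, and I'm hand-waving how each child gets its @temp_array
back to the parent - pipes? temp files? I'd probably also want to cap
how many children run at once instead of forking one per subdirectory:)

    foreach my $subdir (@dirnames) {
        my $pid = fork();
        die "fork failed: $!" unless defined $pid;
        next if $pid;              # parent: go fork the next child
        # child process:
        chdir $subdir or exit 1;
        my @temp_array = `$long_find_grep_command`;
        # ... somehow hand the results back to the parent ...
        exit 0;
    }
    1 while wait() > 0;            # parent reaps all the children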

TX for your help -

