You probably assumed a quad-core CPU and hard-coded 4 processes; that is fragile. Instead, count the number of cores at runtime, so the same script works on dual-core, quad-core, six-core, and bigger machines without editing a process variable. This is the best example I can offer of running bash in parallel: divide the task by the number of cores you have, clone script.cfg once per core, and run the clones in the background. That way each core compresses only its share of the folders.
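The fan-out pattern described above can be sketched in a few lines (worker bodies and messages are placeholders, not the real compression jobs):

```shell
#!/bin/bash
# Minimal sketch: one background worker per core, then wait for all of them.
cores=$(nproc)                     # number of available cores
for i in $(seq 1 "$cores"); do
    # Each subshell stands in for one clone of script.cfg.
    ( echo "worker $i handling its own chunk" ) &
done
wait                               # block until every background worker exits
```

`wait` with no arguments collects every background job, so the script only finishes when all workers are done.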
Create a folder, say bash. Inside it, create two more folders (backup and directory) and put those 21 folders inside directory. Then create the two files below (make the .sh file executable).
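One way to set up that layout from a terminal (folder and file names taken from the steps above):

```shell
# Create the working folder with its two subfolders in one call.
mkdir -p bash/backup bash/directory
cd bash
# Create the two script files; only start.sh needs to be executable by hand,
# since start.sh itself chmods the generated worker scripts later.
touch start.sh script.cfg
chmod +x start.sh
```
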
start.sh
```shell
#!/bin/bash
date=$(date '+%d_%B_%Y')
mkdir backup/$date
# Count cores with nproc; the original /proc/acpi/processor listing is
# absent on modern kernels.
cores=$(nproc)
cd directory
ls > ../list.result
cd ..
file_lines=$(cat list.result | wc -l)
# Ceiling division: give each core its share of the list, rounded up.
let "split_lines=$file_lines/$cores"
let "split_lines=$split_lines+1"
# split -d produces numbered chunks CPU00, CPU01, ...
split -d --lines=$split_lines list.result CPU
for script in $(seq 1 $cores)
do
    suffix=$(printf '%02d' $((script - 1)))   # match split's two-digit names
    cat script.cfg > "CPU$suffix.script"
    chmod +x "CPU$suffix.script"
    # Point each clone at its own chunk of the folder list.
    sed -i "s/filenumber/CPU$suffix/g" "CPU$suffix.script"
    ./CPU$suffix.script &
done
exit
```

and script.cfg
```shell
#!/bin/bash
date=$(date '+%d_%B_%Y')
# "filenumber" is replaced by start.sh with this worker's chunk name.
length=$(cat filenumber | wc -l)
for x in $(seq 1 $length)
do
    # sed "${x}!d" deletes every line except line x, i.e. prints only line x.
    value="directory/"$(sed "${x}!d" filenumber)
    tar -zcvf "$value.tar.gz" "$value"
    mv directory/*.tar.gz backup/$date
done
rm -f *.script CPU* list.result   # every clone cleans up; -f silences races
exit
```

All files in the directory folder will be compressed using all CPU cores and moved to the backup folder. This technique runs all cores at 100%, so keep an eye on CPU temperature. I hope I did not make a mistake; if so, tell me and I will correct it.
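For what it's worth, on systems with GNU xargs the same fan-out can be done without generated scripts: `-P` sets one tar job per core. A hedged sketch, with two demo folders standing in for the 21 from the question:

```shell
#!/bin/bash
# Demo data standing in for the real folders under directory/.
mkdir -p directory/folder1 directory/folder2
touch directory/folder1/a.txt directory/folder2/b.txt
date=$(date '+%d_%B_%Y')
mkdir -p "backup/$date"
# One tar job per entry, at most $(nproc) running at once; {} is replaced
# by the folder name from the list on stdin.
ls directory | xargs -P "$(nproc)" -I{} tar -zcf "backup/$date/{}.tar.gz" "directory/{}"
```

This keeps the same "one job per core" behavior while letting xargs handle the splitting and the background jobs.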
Gabriel http://linux-romania.com