
I assume you have a quad-core CPU and therefore hard-coded 4 processes. That is the wrong approach. Instead, count the number of cores, so the same script extends to dual-core, quad-core, six-core, and other machines without manually changing a process variable. This example is the best I can do for running bash in parallel: it divides the task by the number of cores you have, clones script.cfg once per core, and runs the clones in the background. That way, each core compresses only a share of those folders.

Create a folder, let's say bash. Inside it, create two more folders (backup and directory). Put those 21 folders inside directory. Then create the two files below (and make the .sh file executable).

start.sh

#!/bin/bash
date=$(date '+%d_%B_%Y')
mkdir -p "backup/$date"
cores=$(ls /proc/acpi/processor/ | wc -l)
cd directory
ls > ../list.result
cd ..
# Each core gets an equal share of the list (rounded up).
file_lines=$(wc -l < list.result)
split_lines=$(( file_lines / cores + 1 ))
split -d --lines=$split_lines list.result CPU
for script in $(seq 1 $cores)
do
    suffix=$((script - 1))
    cat script.cfg > "CPU0$suffix.script"
    chmod +x "CPU0$suffix.script"
    sed -i "s/filenumber/CPU0$suffix/g" "CPU0$suffix.script"
    "./CPU0$suffix.script" &
done
exit
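Note that /proc/acpi/processor/ is absent on most modern kernels, so the core count above can silently come out as 0. A more portable sketch (assuming GNU coreutils' nproc, with a POSIX getconf fallback) would be:

```shell
# Count online CPU cores portably: nproc ships with GNU coreutils,
# and getconf _NPROCESSORS_ONLN works on most other Unix systems.
cores=$(nproc 2>/dev/null || getconf _NPROCESSORS_ONLN)
echo "$cores"
```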

and script.cfg

#!/bin/bash
date=$(date '+%d_%B_%Y')
# "filenumber" is a placeholder; start.sh replaces it with this
# clone's own chunk file (CPU00, CPU01, ...).
length=$(wc -l < filenumber)
for x in $(seq 1 $length)
do
    value="directory/$(sed "${x}!d" filenumber)"
    tar -zcvf "$value.tar.gz" "$value"
    mv directory/*.tar.gz "backup/$date"
done
rm *.script CPU* list.result
exit
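The only non-obvious line in script.cfg is the sed call: "${x}!d" deletes every line whose number is not x, so only line x is printed. A tiny standalone illustration (the file /tmp/demo_list is made up for the demo):

```shell
# "${x}!d" means: delete all lines except line x,
# so sed prints just that one line.
printf 'alpha\nbeta\ngamma\n' > /tmp/demo_list
x=2
sed "${x}!d" /tmp/demo_list    # prints: beta
```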

All folders inside the directory folder will be compressed using all CPU cores and moved to the backup folder. This technique runs all cores at 100%, so keep an eye on CPU temperature. I hope I did not make a mistake; if I did, tell me and I will correct it.
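For what it's worth, the same fan-out can be done without generating per-core scripts at all, by letting xargs -P schedule one tar job per core. This is only a sketch under the same layout assumptions (folders under directory/, names without whitespace); the demo1/demo2 folders are created here just so the snippet runs standalone:

```shell
# Parallel compression with xargs -P: one tar process per core,
# no temporary CPU0*.script files needed.
set -e
mkdir -p directory/demo1 directory/demo2   # stand-in input folders
date=$(date '+%d_%B_%Y')
mkdir -p "backup/$date"
# -I {} substitutes each folder name; -P runs that many jobs at once.
ls directory | xargs -P "$(nproc)" -I {} \
  tar -zcf "backup/$date/{}.tar.gz" -C directory "{}"
```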

Gabriel http://linux-romania.com
