I need to submit many jobs in parallel (say, in batches of ten), wait for each batch to complete, and then submit the next ten, and so on. How can I do this?
array=( $(ls -1 window/*realign_win*.txt) ) ;
echo ${#array[@]} ;
#for e in ${!array[*]}
for (( e=0; e<="${#array[@]}"; e++ ))
do
    # echo "$e" ;
    for n in {0..9}
    do
        if [[ $e -gt ${#array[@]} ]]
        then
            echo $e, ${#array[*]} ;
            break ;
        else
            echo $e, ${#array[*]} ;
            j=$(($e+$n)) ;
            echo "didel-1.01-linux-64bit --analysis indels --doDiploid --bamFile $i --ref Homo_sapiens.GRCh37.62.fa --varFile ${array[$j]} --libFile ${i}_didel_output.libraries.txt --outputFile ${array[$j]}.didel_stage3" ;
            #e=$(($e+1)) ;
            echo $e ;
        fi
    done &
    wait
done
done
Please give suggestions to make it work. Thanks in advance.
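For reference, here is a minimal sketch (not a tested solution) of the batching pattern the question seems to be after, staying in bash. The dindel command line, the window/*realign_win*.txt glob and the reference file are taken from the question; $bam is a placeholder for the question's undefined $i variable:

#!/bin/bash
# Launch up to ten dindel jobs in the background, then wait for that whole
# batch to finish before starting the next batch.
array=( window/*realign_win*.txt )   # files to process
bam=sample.bam                       # placeholder for the question's $i

for (( e=0; e<${#array[@]}; e+=10 ))
do
    # start one batch of at most ten background jobs
    for (( n=0; n<10 && e+n<${#array[@]}; n++ ))
    do
        j=$((e+n))
        didel-1.01-linux-64bit --analysis indels --doDiploid \
            --bamFile "$bam" --ref Homo_sapiens.GRCh37.62.fa \
            --varFile "${array[$j]}" \
            --libFile "${bam}_didel_output.libraries.txt" \
            --outputFile "${array[$j]}.didel_stage3" &
    done
    wait   # block until every job in this batch has exited
done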
Parallel::ForkManager can be used to parallelize work in Perl, and can be told to limit the maximum number of simultaneous workers.
use Parallel::ForkManager qw( );
use constant MAX_WORKERS => 10;
my $pm = Parallel::ForkManager->new(MAX_WORKERS);

for my $item (@work) {
    my $pid = $pm->start() and next;   # parent: move on to the next item
    ...                                # child: do the work for $item here
    $pm->finish();                     # child: exit when done
}

$pm->wait_all_children();
The threads and forks modules provide an alternative interface. The choice of module affects whether threads or child processes are used.
use threads;    # or: use forks;
use Thread::Queue qw( );

use constant MAX_WORKERS => 10;

my $q = Thread::Queue->new();

my @threads;
for (1..MAX_WORKERS) {
    push @threads, async {
        while (my $item = $q->dequeue()) {
            ...
        }
    };
}

$q->enqueue(@work);
$q->enqueue(undef) for 1..@threads;
$_->join() for @threads;
Still using a shell script, something like this:
#!/bin/sh

jobs=`jot 25`
echo $jobs
set -- $jobs

while [ $# -gt 0 ]; do
    pids=""
    for i in `jot 10`; do
        [ $# -eq 0 ] && break
        job=$1
        shift
        echo start $job && sleep 3 && echo finish $job &
        pids="$pids $!"
    done
    wait $pids
done
echo done
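The same shift-based pattern can be pointed at the question's window files instead of jot's dummy numbers. Again, this is only a sketch under the assumption that each job should run the dindel command from the question; $bam stands in for the undefined $i:

#!/bin/sh
# Positional parameters hold the window files; each outer pass starts up to
# ten dindel jobs in the background and waits for them before continuing.
bam=sample.bam                      # placeholder for the question's $i
set -- window/*realign_win*.txt

while [ $# -gt 0 ]; do
    pids=""
    for i in `jot 10`; do
        [ $# -eq 0 ] && break
        varfile=$1
        shift
        didel-1.01-linux-64bit --analysis indels --doDiploid \
            --bamFile "$bam" --ref Homo_sapiens.GRCh37.62.fa \
            --varFile "$varfile" \
            --libFile "${bam}_didel_output.libraries.txt" \
            --outputFile "$varfile.didel_stage3" &
        pids="$pids $!"
    done
    wait $pids
done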
EDIT: Changed continue to break. Thanks @glennjackman.