
Is it possible to define the execution order in Parallel.For?

Developer https://www.devze.com 2023-04-09 01:04 Source: web
// parameters.Count == 10
// actualFreeLicenses == 2
Parallel.For(0, parameters.Count,
    new ParallelOptions
    {
        MaxDegreeOfParallelism = actualFreeLicenses
    },
    i =>
    {
        ExternalProgram(i);
    });

When I execute the above code I notice that the values of i passed to the ExternalProgram method are 1 & 6, later 2 & 7, later 3 & 8 ...

If I have 14 parameters and 2 licenses it always launches 1 & 8, later 2 & 9 ...

Is it possible to define the order: first 1 & 2, later 3 & 4, etc.?


How about using a Queue/ConcurrentQueue and dequeueing items in the body of your parallel loop? This will ensure that ordering is preserved.
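A minimal sketch of that idea, assuming the question's values (10 items, 2 licenses) and using Console.WriteLine as a hypothetical stand-in for ExternalProgram. Each worker ignores its loop index and instead pulls the next item from a shared ConcurrentQueue, so items are *started* in FIFO order (low indices first), even though completion order can still interleave:

```csharp
using System;
using System.Collections.Concurrent;
using System.Linq;
using System.Threading.Tasks;

class QueueDemo
{
    static void Main()
    {
        // FIFO queue of the indices 0..9 (stand-in for parameters.Count == 10).
        var queue = new ConcurrentQueue<int>(Enumerable.Range(0, 10));

        Parallel.For(0, 10,
            new ParallelOptions { MaxDegreeOfParallelism = 2 }, // actualFreeLicenses == 2
            _ =>
            {
                // Ignore the loop index; dequeue the next item instead,
                // so items are handed out in source order.
                if (queue.TryDequeue(out int i))
                    Console.WriteLine($"Processing {i}");
            });
    }
}
```

Note that this only controls the order in which items are handed out; with two workers, two items are still in flight at once.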


If you are using Parallel, the order in which the iterations execute is, by definition, not relevant; that is what "Parallel" means. If the order matters to you, you should use a sequential workflow instead.


It sounds like you might want to use Parallel.ForEach with a custom partitioner instead - but don't forget that it's not really doing "1 & 6, then 2 & 7" - it's doing (say) 1 on thread 1, 6 on thread 2, then 7 on thread 2, etc. It's not launching pairs of processes as such.

If you want to launch groups of processes, you should probably perform the grouping yourself, then loop over those groups in series, only providing the parallelism within the group, by specifying a maximum degree of parallelization equal to the group size.
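That grouping approach can be sketched as follows, again assuming the question's 14 parameters and 2 licenses, with Console.WriteLine standing in for the hypothetical ExternalProgram. The outer loop walks the index groups in series; only the inner Parallel.For runs concurrently, so the launch order is 0 & 1, then 2 & 3, and so on:

```csharp
using System;
using System.Threading.Tasks;

class GroupDemo
{
    // Hypothetical stand-in for the question's ExternalProgram(i).
    static void ExternalProgram(int i) => Console.WriteLine($"Started {i}");

    static void Main()
    {
        int count = 14;    // parameters.Count from the question
        int licenses = 2;  // actualFreeLicenses from the question

        // Sequential between groups, parallel only within each group.
        for (int start = 0; start < count; start += licenses)
        {
            int end = Math.Min(start + licenses, count);
            Parallel.For(start, end,
                new ParallelOptions { MaxDegreeOfParallelism = licenses },
                ExternalProgram);
        }
    }
}
```

The trade-off is that each group waits for its slowest member before the next group starts, so a long-running item can leave one license idle.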


If you could switch to using Parallel.ForEach (after having generated a range of numbers, perhaps, using Enumerable.Range), you could use one of the overloads that takes a Partitioner<T>. The Partitioner<T> documentation includes a sample partitioner that provides a single element at a time.
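Rather than writing a custom partitioner by hand, one sketch of this idea uses the built-in Partitioner.Create overload with EnumerablePartitionerOptions.NoBuffering (available since .NET 4.5), which hands workers one element at a time so items are taken in source order instead of in pre-split range chunks:

```csharp
using System;
using System.Collections.Concurrent;
using System.Linq;
using System.Threading.Tasks;

class PartitionerDemo
{
    static void Main()
    {
        // NoBuffering disables chunk buffering: each worker takes the next
        // single element from the source, so items start in order 0, 1, 2, ...
        var source = Partitioner.Create(
            Enumerable.Range(0, 10),                 // parameters.Count == 10
            EnumerablePartitionerOptions.NoBuffering);

        Parallel.ForEach(source,
            new ParallelOptions { MaxDegreeOfParallelism = 2 }, // actualFreeLicenses == 2
            i => Console.WriteLine($"Processing {i}"));         // stand-in for ExternalProgram(i)
    }
}
```

As with the queue approach, this controls hand-out order, not completion order: with two workers, two consecutive items run side by side.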


It would appear that the runtime looks at how many threads you want to use and divides the workload into that many contiguous ranges: e.g. with two threads, the first thread works on the first half of the dataset and the second thread works on the second half.

