
Python: splitting a big list into smaller lists to improve performance?

What is the best way to split a large list (over 1000 items) for performance purposes? And use "parts" of a large list?

For example: I have a list of 10k addresses I want to scan. How can I split this list into fourths and work through it section by section to improve performance?


If you're reading all elements of the list, splitting will actually decrease your performance. You should profile your program (maybe post it in a question here, if you can sufficiently simplify it).
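For instance, a minimal profiling sketch using the standard cProfile module; `scan_addresses` and the generated address list here are made-up placeholders for whatever your real code does:

```python
import cProfile

def scan_addresses(addresses):
    # Hypothetical stand-in for the actual per-address work
    return [addr.strip().lower() for addr in addresses]

addresses = [f"192.168.0.{i}" for i in range(10_000)]

# Run the workload under the profiler, sorted by cumulative time,
# to see which calls dominate before deciding to split anything
cProfile.run("scan_addresses(addresses)", sort="cumulative")
```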

If you're not reading all elements, why are you using a list in the first place? Use a set or dict. Operations on both are extremely fast.
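A quick illustration of the difference, with dummy data: membership tests on a list scan element by element, while a set does a single hash lookup.

```python
# Membership tests: list is O(n), set is O(1) on average
addresses = [f"10.0.{i // 256}.{i % 256}" for i in range(10_000)]
address_set = set(addresses)

"10.0.3.17" in addresses     # walks the list until it finds a match
"10.0.3.17" in address_set   # one hash lookup, regardless of size
```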


See the answers to this question for how to split an arbitrary list into smaller chunks. That question doesn't cover performance explicitly, so if parallelism is what you're after, look into the multiprocessing module to actually work on those smaller lists simultaneously.
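For reference, the usual chunking idiom from answers like that looks something like this; the chunk size of 2,500 is just one way to cut 10k items into four parts:

```python
def chunks(lst, n):
    """Yield successive n-sized slices of lst."""
    for i in range(0, len(lst), n):
        yield lst[i:i + n]

addresses = [f"host-{i}" for i in range(10_000)]
for part in chunks(addresses, 2500):  # 10k addresses -> four parts
    print(len(part))  # stand-in for whatever per-chunk work you do
```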


So you want to do an operation on the whole list, but it's slow doing it sequentially? I think you want a worker pool. Does this help?
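Something along these lines, using multiprocessing.Pool; the `scan` function is a hypothetical stand-in for whatever check you run per address:

```python
from multiprocessing import Pool

def scan(address):
    # Hypothetical stand-in for the real per-address check
    return address, len(address)

if __name__ == "__main__":
    addresses = [f"host-{i}" for i in range(10_000)]
    with Pool(processes=4) as pool:
        # Pool.map splits the list into chunks behind the scenes and
        # farms them out to worker processes, so you never split by hand
        results = pool.map(scan, addresses, chunksize=250)
    print(len(results))
```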

