Python -- Share a Numpy Array Between Processes?

https://www.devze.com 2023-03-17 18:54 (source: web)
I'd like to use Python's multiprocessing module to utilize a multi-core Linux server.

I need all processes to have read/write access to the same shared memory.

Instead of using a list or a queue, is it possible to have a multi-dimensional NumPy array as the shared object?


I think I know what you're looking for: https://bitbucket.org/cleemesser/numpy-sharedmem/issue/3/casting-complex-ndarray-to-float-in

There's a short description on the page: "A shared memory module for NumPy by Sturla Molden and G. Varoquaux that makes it easy to share memory between processes in the form of NumPy arrays. Originally posted to the SciPy-user mailing list."

I myself use it in exactly that way: sharing NumPy arrays between processes. It works very well for me.
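If that third-party module is unavailable, modern Python (3.8+) has a similar facility built in: `multiprocessing.shared_memory`. This is not the module the answer links to, just a rough stdlib sketch of the same idea, sharing one array's buffer between a parent and a child process:

```python
from multiprocessing import Process, shared_memory
import numpy as np

def worker(name, shape, dtype):
    # Attach to the existing shared block by name and view it as an array.
    existing = shared_memory.SharedMemory(name=name)
    arr = np.ndarray(shape, dtype=dtype, buffer=existing.buf)
    arr *= 2  # in-place write, visible to the parent
    existing.close()

if __name__ == "__main__":
    src = np.arange(6, dtype=np.float64).reshape(2, 3)
    # Allocate a shared block and copy the data into it once.
    shm = shared_memory.SharedMemory(create=True, size=src.nbytes)
    arr = np.ndarray(src.shape, dtype=src.dtype, buffer=shm.buf)
    arr[:] = src
    p = Process(target=worker, args=(shm.name, src.shape, src.dtype))
    p.start()
    p.join()
    print(arr)  # the child's doubling is visible here
    shm.close()
    shm.unlink()  # free the block when fully done
```

Note that only the parent should `unlink()`; each process `close()`s its own handle.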


Look at this. It doesn't seem easy, but it's doable.

Edit: Link rotted, I have linked to another copy.
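Since the original link has rotted, here is a sketch of the classic technique it described: back a NumPy array with a `multiprocessing.Array` (a lock-protected shared ctypes buffer) and view it with `np.frombuffer`, so parent and children read/write the same memory. The helper names are mine, not from the linked page:

```python
import multiprocessing as mp
import numpy as np

def worker(shared, shape):
    # Reinterpret the shared buffer as a NumPy array -- no copy is made.
    arr = np.frombuffer(shared.get_obj()).reshape(shape)
    arr[0, :] = 42.0  # this write is visible to the parent process

if __name__ == "__main__":
    shape = (2, 3)
    # "d" = C double, matching NumPy's default float64 dtype.
    shared = mp.Array("d", shape[0] * shape[1])
    p = mp.Process(target=worker, args=(shared, shape))
    p.start()
    p.join()
    arr = np.frombuffer(shared.get_obj()).reshape(shape)
    print(arr[0])  # first row was set to 42.0 by the child
```

For concurrent writers, acquire `shared.get_lock()` around the modification; for read-mostly workloads, `mp.RawArray` skips the lock entirely.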


I found that even if you do not modify your NumPy array after fork()'ing a bunch of child processes, you will still see your RAM usage skyrocket as the child processes copy the object's pages via copy-on-write (likely because CPython's reference counting writes to memory pages even when your code only reads).

You can limit (or perhaps eliminate?) this problem by setting

yourArray.flags.writeable = False

BEFORE fork()'ing/Pool()'ing, which seems to keep the RAM usage down, and is a LOT less hassle than the other methods :)
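A minimal sketch of that pattern, assuming the Linux default "fork" start method so children inherit the module-level array; the names `big` and `row_sum` are illustrative:

```python
import multiprocessing as mp
import numpy as np

def row_sum(i):
    # Child processes inherit `big` via fork and only read it.
    return big[i].sum()

big = np.ones((4, 1000))
# Mark the array read-only BEFORE creating the Pool, as the answer above
# suggests; any accidental write in a child now raises a ValueError
# instead of silently modifying (and copying) pages.
big.flags.writeable = False

if __name__ == "__main__":
    with mp.Pool(2) as pool:
        print(pool.map(row_sum, range(4)))  # [1000.0, 1000.0, 1000.0, 1000.0]
```

Note this shares memory read-only; for read/write access you still need one of the shared-memory approaches in the other answers.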
