
Python variables between Modules/Classes


SiteGrab.py

class ThreadManager:
    # ... thread setup (bla bla bla) ...

    while True:  # ask until all sites are resolved
        if allsites_got:
            for i in range(allsites):
                HTML[i].insert(0, "this is a piece of text")
            break

    def GetDNS(self):
        global HTML
        return HTML

execute.py

from SiteGrab import *

manager = ThreadManager()
manager.start(bla, bla, bla)
_HTML_ = manager.GetDNS()
print(_HTML_)

I am trying to load the HTML of a list of websites. I am doing this in threads (e.g. 5 websites = 5 threads). I would like to be notified AS EACH website is finished. In other words, if one of the 5 websites is going to time out, I don't want to have to wait for the timeout before I get the other 4 results. I want them to trickle in as they finish.

Here's where I am stuck. I have a SiteGrab.py module that sends the results to the main module. But the main module must ASK for the results ...

_HTML_ = manager.GetDNS()
print(_HTML_)

... and it can only do so after ...

manager.start(bla, bla, bla) ... has completed. But manager.start(bla, bla, bla) will only complete after the LAST site is resolved.

How can I change this so that the results trickle into execute.py?
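
To make it concrete, here is roughly the behaviour I'm after, as a minimal sketch using threads and a queue.Queue (the URLs and the worker body are placeholders, not my real code):

    import threading
    import queue
    import urllib.request

    urls = ["http://example.com", "http://example.org", "http://example.net"]
    results = queue.Queue()

    def fetch(url):
        # Each worker pushes its result the moment it finishes,
        # instead of waiting for the whole batch.
        try:
            results.put((url, urllib.request.urlopen(url, timeout=10).read()))
        except Exception as exc:
            results.put((url, exc))

    for url in urls:
        threading.Thread(target=fetch, args=(url,)).start()

    # Results arrive one by one as each thread completes; a slow or
    # timing-out site does not delay the ones that finished earlier.
    for _ in urls:
        url, outcome = results.get()
        print(url, "->", type(outcome).__name__)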

Thanks for the help!


I would like to be notified AS EACH website is finished.

By whom? How? This makes very little sense. Do you want it to "bing!" when it's done? Add that to the script.

I want them to trickle in as they finish.

Trickle in to where? A file? A directory?

What does execute.py do with these sites?

Your requirements make little sense.

curl site1 >file1 & curl site2 >file2 & curl site3 >file3 & ...

All five run at the same time. No threads, just OS-level processes using as many OS resources as possible. No waiting. The results trickle in concurrently.

Consider this.

( for site in site1 site2 site3 site4 site5
do
    curl "$site" | python execute.py &
done
wait
)
echo "Bing!"

This will fetch all five at the same time. Each one is piped into standard input for execute.py to process with a simple sys.stdin.read().
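
For that, execute.py only needs to read its standard input; a minimal sketch (the processing step is a placeholder):

    import sys

    # curl pipes the page into our standard input; read it all at once.
    html = sys.stdin.read()

    # Placeholder processing -- replace with whatever execute.py should do.
    print(len(html), "bytes received")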


This is better done using an asynchronous model in a single thread. The Twisted framework does that. You can also use pycurl with its CurlMulti handler. The client.py module can do 1000 requests in a second using it.
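
A minimal sketch of the CurlMulti polling loop (assuming pycurl is installed; the URLs are placeholders), following the usual multi-handle pattern:

    import pycurl
    from io import BytesIO

    urls = ["http://example.com", "http://example.org", "http://example.net"]

    multi = pycurl.CurlMulti()
    for url in urls:
        handle = pycurl.Curl()
        handle.body = BytesIO()            # Curl objects allow extra attributes
        handle.setopt(pycurl.URL, url)
        handle.setopt(pycurl.WRITEDATA, handle.body)
        handle.setopt(pycurl.TIMEOUT, 30)
        multi.add_handle(handle)

    remaining = len(urls)
    while remaining:
        # Drive all transfers without blocking.
        while True:
            status, num_active = multi.perform()
            if status != pycurl.E_CALL_MULTI_PERFORM:
                break
        # Each site is reported the moment its transfer finishes -- the "trickle".
        num_queued, finished, failed = multi.info_read()
        for handle in finished:
            print("done:", handle.getinfo(pycurl.EFFECTIVE_URL),
                  len(handle.body.getvalue()), "bytes")
            multi.remove_handle(handle)
            remaining -= 1
        for handle, errno, errmsg in failed:
            print("failed:", errmsg)
            multi.remove_handle(handle)
            remaining -= 1
        multi.select(1.0)                  # wait for socket activity

All requests run in a single thread; the event loop notices each completion as it happens, which is exactly the trickle-in behaviour asked for.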


Look at Python's multiprocessing module. It offers several ways to do multiprocessing right (such as using a pool of worker processes), and arbitrary callbacks can be configured, e.g. to signal that a result is available.
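
For example, a minimal sketch with multiprocessing.Pool and a per-result callback (fetch and the URLs are placeholders):

    import multiprocessing
    import urllib.request

    def fetch(url):
        # Runs in a worker process; returns when this one site is done.
        return url, urllib.request.urlopen(url, timeout=10).read()

    def report(result):
        # Runs in the parent as soon as each worker finishes.
        url, html = result
        print(url, "finished:", len(html), "bytes")

    if __name__ == "__main__":
        urls = ["http://example.com", "http://example.org", "http://example.net"]
        with multiprocessing.Pool(processes=len(urls)) as pool:
            for url in urls:
                pool.apply_async(fetch, (url,), callback=report)
            pool.close()
            pool.join()   # report() fires per result while we wait here

apply_async also accepts an error_callback, which covers the timeout case without blocking the other results.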

