Background: I need to get updated data for all of my users. The data resides on a secure site, so the script needs to log in (using cookies), traverse some inner URLs, and then fetch the required data.
Tools: WWW::Mechanize or Curl
What is the best tool for my needs? Performance is a big issue: I need to fetch the updated data as fast as possible, since it has to be retrieved for a large number of users.
Is it possible to fire up multiple requests using the WWW::Mechanize library?
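The login-and-traverse flow described above can be sketched with WWW::Mechanize, which keeps session cookies automatically in its built-in cookie jar. The URLs, form selector, and field names below are placeholders, not your site's real ones:

```perl
use strict;
use warnings;
use WWW::Mechanize;

my $mech = WWW::Mechanize->new( autocheck => 1 );  # die on HTTP errors

# Log in; the session cookie is stored in Mechanize's cookie jar.
$mech->get('https://example.com/login');           # placeholder URL
$mech->submit_form(
    form_number => 1,                              # assumes the first form is the login form
    fields      => { username => 'user', password => 'secret' },
);

# Traverse an inner link, then fetch the data page.
$mech->follow_link( text_regex => qr/Reports/ );   # placeholder link text
$mech->get('https://example.com/users/42/data');   # placeholder data URL
my $data = $mech->content;
```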
Update:
I got it running using Curl, but I was thinking I could speed it up using Mechanize. Which library performs better for HTTP requests? Are there any benchmarks? Right now I am using Curl with the multi interface.
WWW::Mechanize is a Perl module, so you can use the full power of the language with it, for example by forking multiple processes.
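One common way to do that, sketched here under the assumption that each user's data lives at its own URL, is to fork a bounded pool of workers with Parallel::ForkManager, each child running its own WWW::Mechanize object (a Mechanize object, and its cookie jar, should not be shared across processes):

```perl
use strict;
use warnings;
use WWW::Mechanize;
use Parallel::ForkManager;

my @urls = map { "https://example.com/users/$_/data" } 1 .. 100;  # placeholder URLs

my $pm = Parallel::ForkManager->new(10);   # at most 10 concurrent children

for my $url (@urls) {
    $pm->start and next;                   # parent: skip ahead to the next URL

    # Each child gets its own Mechanize object and cookie jar,
    # so the sessions cannot interfere with each other.
    my $mech = WWW::Mechanize->new( autocheck => 0 );
    $mech->get($url);
    warn "failed: $url\n" unless $mech->success;

    $pm->finish;                           # child process exits here
}

$pm->wait_all_children;
```

In practice each child would still have to log in first (or load a cookie jar the parent saved to disk after logging in), since forked children do not share a live session with the parent.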