
PHP cURL multi_exec delay between requests

Developer https://www.devze.com 2023-03-26 13:53 Source: Web

If I run a standard cURL_multi_exec function (example below), I get all cURL handles requested at once. I would like to put a delay of 100ms between each request, is there a way to do that? (nothing found on Google & StackOverflow search)

I've tried usleep() before curl_multi_exec() which slows down the script but does not postpone each request.

// array of curl handles & results
$curlies = array();
$result = array();
$mh = curl_multi_init();

// setup curl requests
for ($id = 0; $id <= 10; $id += 1) {
    $curlies[$id] = curl_init();
    curl_setopt($curlies[$id], CURLOPT_URL,            "http://google.com");
    curl_setopt($curlies[$id], CURLOPT_HEADER,         0);
    curl_setopt($curlies[$id], CURLOPT_RETURNTRANSFER, 1);
    curl_multi_add_handle($mh, $curlies[$id]);
}

// execute the handles
$running = null;
do {
    curl_multi_exec($mh, $running);
} while($running > 0);

// get content and remove handles
foreach($curlies as $id => $c) {
    $result[$id] = curl_multi_getcontent($c);
    curl_multi_remove_handle($mh, $c);
}

// all done
curl_multi_close($mh);

I've been working on this all day; any help would be greatly appreciated. Thank you.

EDIT: Any other non-cUrl method? That would also answer my question.


Yes, this is possible. If you use the ParallelCurl library, it becomes much easier: each request is added to the download queue separately, so you can insert your 100ms delay with usleep() between the startRequest() calls.

foreach ($urls as $url) {
    $pcurl->startRequest($url);
    usleep(100000); // 100 ms
}


I don't think you can with curl_multi alone. If you run this from the CLI, you could instead fork your script into 10 processes and fire a regular curl request from each. That would give you fine-grained control over the timing.
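A rough sketch of that forking approach, assuming the pcntl extension is available (CLI only). The URL is a placeholder; each child sleeps for a delay proportional to its index, so the requests start 100 ms apart:

```php
<?php
// Sketch: stagger requests by forking one child per request.
// Requires the pcntl extension (CLI SAPI only).
$url = "http://google.com"; // placeholder target

$pids = array();
for ($id = 0; $id < 10; $id++) {
    $pid = pcntl_fork();
    if ($pid === -1) {
        die("fork failed\n");
    } elseif ($pid === 0) {
        // child: wait $id * 100 ms, then fire a plain curl request
        usleep($id * 100000);
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_exec($ch);
        curl_close($ch);
        exit(0);
    }
    $pids[] = $pid; // parent: remember the child pid
}

// parent: reap all children
foreach ($pids as $pid) {
    pcntl_waitpid($pid, $status);
}
```

Each child is fully independent, so a slow or failed request never blocks the others; the cost is one process per request.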


PHP is not a good fit for this. Forking the script won't help much either: it works at first, but once you need to fetch more than a handful of websites this way, your server will run deep into the red. In terms of cost and script stability, you should reconsider and use a different approach.

You can do this easily in Python, and for non-blocking, real-time calls to API endpoints you should look at something like Node.js, possibly with Socket.IO.

If you don't have the time or inclination for that, you can use something like this:

http://framework.zend.com/manual/en/zendx.console.process.unix.overview.html

It all depends on what you are trying to achieve.


You can try this:
Store a timestamp in the DB, add one handle, and call curl_multi_exec().
Use CURLOPT_PROGRESSFUNCTION to check timings and add more handles when you need them.
Here Daniel Stenberg (the author of cURL and libcurl) says it's possible to add more handles after curl_multi_exec has started running.
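A minimal sketch of that idea, simplified to use microtime() in-process instead of a DB timestamp: new handles are added to the multi handle one at a time, 100 ms apart, while transfers already in flight keep being serviced. The URL list is a placeholder.

```php
<?php
// Sketch: stagger requests by adding handles to the multi handle
// over time instead of all at once.
$urls = array_fill(0, 10, "http://google.com"); // placeholder URLs

$mh      = curl_multi_init();
$handles = array();
$result  = array();
$next    = 0;   // index of the next handle to add
$last    = 0.0; // time the last handle was added

$running = 0;
do {
    // add the next handle once 100 ms have passed since the last one
    if ($next < count($urls) && (microtime(true) - $last) >= 0.1) {
        $ch = curl_init($urls[$next]);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_multi_add_handle($mh, $ch);
        $handles[$next] = $ch;
        $last = microtime(true);
        $next++;
    }

    curl_multi_exec($mh, $running);
    curl_multi_select($mh, 0.05); // wait briefly for socket activity

} while ($running > 0 || $next < count($urls));

// collect content and clean up
foreach ($handles as $id => $ch) {
    $result[$id] = curl_multi_getcontent($ch);
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);
```

The loop condition keeps spinning until all handles have been added *and* all transfers have finished, so early requests can complete while later ones are still waiting their turn.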

