PHP: faster cURL execution

Developer https://www.devze.com 2023-02-09 02:50 Source: web
I have an application that uses cURL to grab the contents of several websites. I'd like to optimize this somehow. Would it be possible to implement a singleton design pattern and somehow feed curl the URLs I need contents for at certain intervals -- such that I only instantiate it once?

Right now, I set up and destroy connections for each call. Sample code would be highly appreciated.

Thanks.


This sounds like unnecessary micro-optimization to me. You'll save a fraction of a microsecond for a process that has to go across the internet to grab a hunk of data from a resource that's already out of your control. If you're simply trying to get the process to run faster, maybe try running multiple downloads in parallel.

Edit: And/or make sure your cURL build supports compressed content.
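A minimal sketch of requesting compressed content with PHP's cURL extension (the URL is a placeholder). Passing an empty string to `CURLOPT_ENCODING` lets cURL advertise every encoding it was built with and decompress the response transparently:

```php
<?php
// Ask the server for compressed content; cURL decodes it for us.
$ch = curl_init('https://example.com/');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_ENCODING, ''); // "" = accept all supported encodings (gzip, deflate)
$body = curl_exec($ch);
curl_close($ch);
```

For large text responses this can cut transfer time substantially, since the bottleneck here is the network, not PHP.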


A number of possible solutions come to mind. The easiest one is probably to build in some kind of caching mechanism. Store the response on disk, and use that until it becomes stale, then perform a new request to update the cache. That alone should vastly improve your performance. Another way of implementing this would be to use a caching proxy server.
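A simple file-based cache along those lines might look like this. This is a sketch, not a production cache: the temp-directory path, the `fetch_cached` name, and the 300-second TTL are all placeholder choices.

```php
<?php
// Serve the stored copy while it is fresh; otherwise re-fetch and overwrite it.
function fetch_cached($url, $ttl = 300)
{
    $file = sys_get_temp_dir() . '/cache_' . md5($url);

    // Cache hit: the file exists and is younger than the TTL.
    if (is_file($file) && time() - filemtime($file) < $ttl) {
        return file_get_contents($file);
    }

    // Cache miss or stale: fetch a fresh copy with cURL.
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    $body = curl_exec($ch);
    curl_close($ch);

    if ($body !== false) {
        file_put_contents($file, $body);
    }
    return $body;
}
```

With this in place, only the first request per TTL window pays the network cost; everything else is a local disk read.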

Another option is to simply create a cronjob with wget fetching the needed content every couple of minutes and storing the result on disk. Then just access that content from your application. That way, you'll never have to wait for a request to finish.


Sure, just use curl_multi to run requests in parallel. Look at the example on the curl_multi_exec manual page.

As a side note: this has nothing to do with the singleton pattern or design patterns in general. While static state lets you keep a persistent application state between requests in Java, that is not possible in PHP, where each request starts from scratch.
