Speeding up PHP continuous integration build server on Hudson CI

https://www.devze.com 2023-01-16 01:45 Source: web
I'm trying to speed up my builds some and was looking for some thoughts on how to do so. I currently use Hudson as a continuous integration server for a PHP project.

I use an Ant build.xml file to do the build, using a file similar to Sebastian Bergmann's php-hudson-template. At the moment, though (due to some weird problems with Hudson crashing otherwise), I'm only running phpDocumentor, phpcpd, and phpUnit. phpUnit does generate Clover code-coverage reports, too.

Here are some possible bottlenecks:

  1. phpDocumentor: Takes 180 seconds. There are some large included libraries in my project, such as awsninja, DirectedEdge, oauthsimple, and phpMailer. I'm not sure that I really need to be developing documentation for these. I'm also not sure how to ignore whole subdirectories using my build.xml file.
  2. phpUnit: Takes 120 seconds. This is the only portion of the build that's not run as a parallelTask. The more tests that get written, the longer this will take. Really not sure what to do about this, aside from maybe running multiple Hudson build slaves and doling out separate test suites to each slave. But I have no idea how to go about that, either.
  3. phpcpd: Takes 97 seconds. I'm sure that I can eliminate some parsing and conversion time by ignoring those included libraries. Not sure how to do this in my build.xml file.
  4. My server: Right now I'm using a single Linode server. It seems to get pretty taxed by the whole process.
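For the phpcpd bottleneck, excluding the bundled libraries can be done from the Ant target that invokes it. The following is only a sketch: the target name, log path, and library directory names (lib/awsninja, lib/phpMailer) are assumptions, and it assumes a phpcpd version that supports the --exclude and --log-pmd options.

```xml
<!-- Sketch of a build.xml target; paths and directory names are assumed. -->
<target name="phpcpd">
  <exec executable="phpcpd">
    <arg value="--log-pmd" />
    <arg value="${basedir}/build/logs/pmd-cpd.xml" />
    <!-- --exclude can be repeated once per directory to skip -->
    <arg value="--exclude" />
    <arg value="lib/awsninja" />
    <arg value="--exclude" />
    <arg value="lib/phpMailer" />
    <arg path="${basedir}/src" />
  </exec>
</target>
```

Skipping third-party code this way should shave off a good share of the 97 seconds, since phpcpd no longer has to tokenize those libraries at all.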

Any other possible bottlenecks you can think of I'll add to the list.

What are some solutions for reducing my build time?


I'm not a PHP expert at all, but you ought to be able to split your PHPUnit tests onto multiple Hudson slaves if you need to. I would just split your test suite up and run each subset as a separate, parallel Hudson job. If you have a machine with multiple CPUs / cores you can run multiple slaves on it.
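One way to make the split concrete is to declare named test suites in phpunit.xml and have each Hudson job run only one of them. This is a sketch: the suite names and directory layout (tests/unit, tests/integration) are assumptions, and running a single suite by name via the --testsuite command-line option requires a reasonably recent PHPUnit; on older versions you can instead point phpunit at the subdirectory directly.

```xml
<!-- Sketch of a phpunit.xml; suite names and paths are assumed. -->
<phpunit>
  <testsuites>
    <!-- Job A on slave 1 runs: phpunit --testsuite unit -->
    <testsuite name="unit">
      <directory>tests/unit</directory>
    </testsuite>
    <!-- Job B on slave 2 runs: phpunit --testsuite integration -->
    <testsuite name="integration">
      <directory>tests/integration</directory>
    </testsuite>
  </testsuites>
</phpunit>
```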

One obvious thing you didn't mention: how about just upgrading your hardware, or taking a look at what else is running on the Hudson host and possibly taking up resources?


  1. phpDocumentor: phpdoc -h reveals the -i option, which lets you specify a comma-separated list of files/directories to ignore. This can be added to the arguments of the phpdoc task in your build.xml.
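For example, the build.xml fragment below passes -i to phpdoc. This is only a sketch: the source path, target path, and the library directory names to ignore are assumptions based on the libraries mentioned in the question.

```xml
<!-- Sketch of a build.xml target; paths and ignored directories are assumed. -->
<target name="phpdoc">
  <exec executable="phpdoc">
    <!-- -d: source directory, -t: output directory,
         -i: comma-separated list of paths to ignore
         (trailing slash marks a directory in phpDocumentor v1) -->
    <arg line="-d ${basedir}/src
               -t ${basedir}/build/api
               -i lib/awsninja/,lib/DirectedEdge/,lib/oauthsimple/,lib/phpMailer/" />
  </exec>
</target>
```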

  2. phpUnit: I noticed it can be laggy when running tests against a database, but I am not aware of any way to improve this.

One thing that might help would be to not run phpDocumentor on every build, and instead run it only as part of a build that happens once a day (or something similar).

I just recently started using these tools and these are few things I discovered.


When we had a similar problem, we resorted to running the documentation in a separate overnight build (along with our functional test scripts in Selenium, as this is also pretty slow). This way, our main CI build wasn't slowed down by generating our API documentation.
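In Hudson, a separate overnight build like this can be set up by ticking "Build periodically" in the job configuration and entering a cron-style schedule. The job config.xml fragment below is a sketch of what that produces; the 2 a.m. schedule is an arbitrary example.

```xml
<!-- Sketch of a fragment from the job's config.xml; equivalent to
     "Build periodically" in the UI with a daily 2 a.m. schedule. -->
<triggers>
  <hudson.triggers.TimerTrigger>
    <spec>0 2 * * *</spec>
  </hudson.triggers.TimerTrigger>
</triggers>
```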

However, I note that phpDocumentor has now been updated to version 2, which has significant speed improvements over the slow old version 1. It looks like it's in the region of two to three times faster than v1, which will make a big difference to your CI process. See http://phpdoc.org/ for more info.

Alternatively, you could take a look at apiGen and phpDox, both of which are alternatives to PHPDoc. They are both definitely faster than PHPDoc v1; I haven't compared them with v2 yet.
