I have a Hudson job that just does a check-out/update of a third-party library. Call this Job A.
Several other jobs depend on this library. Call them Jobs B and C. They use the stuff checked out by Job A, and need it to be up-to-date.
My question is, how can I require Jobs B and C to always run Job A (to update the library) before they run through their build routine?
If this is not possible, can someone recommend another way to achieve the same effect?
You can do it the other way around, with "child" jobs: configure Job A to trigger Jobs B and C after it has succeeded. (You will find the option on Job A's configuration page.)
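For reference, that "build other projects" setting ends up in Job A's config.xml roughly as the publisher sketched below. The job names B and C are placeholders and the exact element layout can differ between Hudson versions, so treat this as a sketch rather than something to paste verbatim:

    <!-- Sketch: post-build "Build other projects" in Job A's config.xml.
         Triggers jobs B and C, but only when Job A's build succeeds. -->
    <publishers>
      <hudson.tasks.BuildTrigger>
        <childProjects>B, C</childProjects>
        <threshold>
          <name>SUCCESS</name>
          <ordinal>0</ordinal>
          <color>BLUE</color>
        </threshold>
      </hudson.tasks.BuildTrigger>
    </publishers>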
If you need more advanced conditions for triggering the child jobs, take a look at the Parameterized Trigger plugin.
After thinking about the problem some more, I think I may have been over-complicating things.
Since the library in Job A is rarely updated, we decided it's probably acceptable to just poll SVN on an interval and update when there are changes. There's a small possibility that builds of B and C will miss library changes if they start right after those changes are checked in but before the next poll, but that should rarely be an issue.
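For anyone setting this up, the interval scan is just a cron-style schedule in Job A's "Poll SCM" field. A minimal sketch, with the 15-minute interval being an arbitrary choice:

    # Job A -> Build Triggers -> Poll SCM (cron syntax).
    # Checks SVN every 15 minutes and runs the checkout/update
    # only when new changes are found.
    */15 * * * *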
If I follow you, it sounds like you might need the Join plugin:
This plugin allows a job to be run after all the immediate downstream jobs have completed. In this way, the execution can branch out and perform many steps in parallel, and then run a final aggregation step just once after all the parallel work is finished. The plugin is useful for creating a 'diamond' shape project dependency. This means there is a single parent job that starts several downstream jobs. Once those jobs are finished, a single aggregation job runs.
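Applied to this question, the "diamond" would look roughly like the sketch below, where the final join job is a hypothetical aggregation step you would add yourself:

        A    (checks out / updates the library)
       / \
      B   C
       \ /
      join   (hypothetical aggregation job, run once B and C have both finished)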