
Asynchronous background processes in Python?

Source: https://www.devze.com 2022-12-24 20:08
I have been using this as a reference, but not able to accomplish exactly what I need: Calling an external command in Python


I was also reading this: http://www.python.org/dev/peps/pep-3145/

For our project, we have 5 svn checkouts that need to be updated before we can deploy our application. In my dev environment, where speedy deployments are a bit more important for productivity than in a production deployment, I have been working on speeding up the process.

I have a bash script that has been working decently but has some limitations. I fire up multiple 'svn updates' with the following bash command:

(svn update /repo1) & (svn update /repo2) & (svn update /repo3) &

These all run in parallel and it works pretty well. I also use this pattern in the rest of the build script for firing off each ant build, then moving the wars to Tomcat.

However, I have no control over stopping deployment if one of the updates or a build fails.

I'm re-writing my bash script with Python so I have more control over branches and the deployment process.

I am using subprocess.call() to fire off the 'svn update /repo' commands, but each one runs sequentially. If I try '(svn update /repo) &', they all fire off at once, but call() returns immediately with a success code, so I have no way to determine whether a particular command failed in this asynchronous mode.

import subprocess

subprocess.call( 'svn update /repo1', shell=True )
subprocess.call( 'svn update /repo2', shell=True )
subprocess.call( 'svn update /repo3', shell=True )
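To illustrate why the three calls run sequentially: subprocess.call() blocks until the child process exits, while subprocess.Popen() starts the child and returns immediately. A minimal timing sketch, using `sleep` as a stand-in for `svn update` (assumes a Unix-like system where `sleep` is available):

```python
import subprocess
import time

# subprocess.call() blocks until each child exits, so these run one after another:
start = time.time()
for _ in range(3):
    subprocess.call(['sleep', '1'])
sequential = time.time() - start   # roughly 3 seconds

# subprocess.Popen() returns immediately, so the children overlap:
start = time.time()
procs = [subprocess.Popen(['sleep', '1']) for _ in range(3)]
for proc in procs:
    proc.wait()
parallel = time.time() - start     # roughly 1 second
```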

I'd love to find a way to have Python fire off each Unix command, and if any of the calls fails at any time the entire script stops.


Don't use shell=True. It needlessly invokes a shell to run your svn program, and with a trailing '&' the shell backgrounds the command and exits immediately with its own status code (0), not svn's.

import subprocess

repos = ['/repo1', '/repo2', '/repo3']
# launch the three svn updates asynchronously:
procs = [subprocess.Popen(['svn', 'update', repo]) for repo in repos]
# wait for all of them to finish:
for proc in procs:
    proc.wait()
# check the results:
if any(proc.returncode != 0 for proc in procs):
    print('Something failed')
