We have around 250 identical Linux servers running a business-critical web application for a bank. We do a lot of scripting work, and I now want to centralize it in one location: run everything from one server and deploy to the many. I know you must be thinking this is an easy task that can be done with a shell script, but we need to create many different scripts to do our work.
I know Python has a big library and that this should be possible, but I don't know how. In short, I want all the scripts in one file, and based on the argument passed it should execute the corresponding one.
For example, in a Python program we have functions, and we can combine them to produce different results.
Could you please let me know how to go about it?
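To make it concrete, here is a rough sketch of the kind of single-file, argument-driven toolbox I am after; the task names (check-disk, rotate-logs) and their options are just invented examples:

    #!/usr/bin/env python
    # Rough sketch of a single-file admin toolbox dispatching on its
    # first argument. Task names here are invented examples.
    import argparse

    def check_disk(args):
        print("checking disk usage under %s" % args.path)

    def rotate_logs(args):
        print("rotating logs older than %d days" % args.days)

    def main():
        parser = argparse.ArgumentParser(description="central admin toolbox")
        sub = parser.add_subparsers(dest="command")
        sub.required = True  # insist on a subcommand

        p = sub.add_parser("check-disk")
        p.add_argument("--path", default="/")
        p.set_defaults(func=check_disk)

        p = sub.add_parser("rotate-logs")
        p.add_argument("--days", type=int, default=7)
        p.set_defaults(func=rotate_logs)

        args = parser.parse_args()
        args.func(args)  # run whichever task the argument selected

    if __name__ == "__main__":
        main()

So I could call it like ./toolbox.py check-disk --path /var, and add more tasks over time.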
This is a very general question, so I'll respond with two Python frameworks built to facilitate bulk system-administration tasks.
func - Func is part of the Fedora project, so it is specialized to that architecture. If your hosts are all RedHat/CentOS based, this is the solution for you.
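A rough sketch of driving all registered minions from one overlord box; this assumes func is already installed with certificates exchanged, and I am writing the client calls from memory, so treat it as an approximation rather than a verified recipe:

    # Sketch only: assumes func's overlord/minion setup is in place.
    import func.overlord.client as fc

    # "*" targets every registered minion; a glob like "web*" narrows it.
    client = fc.Client("*")

    # Run a command on all hosts; results come back keyed by hostname.
    results = client.command.run("uptime")
    for host, output in results.items():
        print(host, output)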
fabric - Fabric is more generic but does broadly the same thing at a high level. If your environment is heterogeneous (full of different types of systems/distributions), this is probably what you want.
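As a taste, here is a minimal fabfile sketch using the classic Fabric 1.x API; the host names and script path are placeholders for your own environment:

    # fabfile.py -- minimal Fabric 1.x sketch; hosts/paths are placeholders.
    from fabric.api import env, put, run, sudo, task

    # Every task below fans out to each host listed here.
    env.hosts = ["web%03d.example.com" % i for i in range(1, 251)]

    @task
    def uptime():
        run("uptime")                      # invoke with: fab uptime

    @task
    def deploy(local_path="scripts/cleanup.sh"):
        # Push a centrally maintained script to each host and execute it.
        put(local_path, "/usr/local/bin/cleanup.sh", mode=0o755)
        sudo("/usr/local/bin/cleanup.sh")  # invoke with: fab deploy

You keep all the tasks in one fabfile on the central server, and Fabric handles the SSH fan-out for you.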
You could also try one of the distributed-computing packages; Pyro is one that might interest you.
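For completeness, a bare-bones Pyro4 server sketch; the object name, port, and the single "uptime" task are assumptions for illustration only:

    # server.py -- bare-bones Pyro4 sketch; object name/port are assumptions.
    import subprocess
    import Pyro4

    @Pyro4.expose
    class Toolbox(object):
        def run_task(self, name):
            # Trivial dispatcher: only the placeholder "uptime" is wired up.
            if name == "uptime":
                return subprocess.check_output(["uptime"])
            raise ValueError("unknown task: %s" % name)

    daemon = Pyro4.Daemon(host="0.0.0.0", port=9090)
    uri = daemon.register(Toolbox, objectId="toolbox")
    print("Serving at", uri)
    daemon.requestLoop()

A client on the central box would then call it with Pyro4.Proxy("PYRO:toolbox@somehost:9090").run_task("uptime").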