I have a bash backup script that runs as root (from cron) and delegates certain tasks to other bash scripts owned by different users. The principle: some things have to be done as root, while other tasks are delegated to users with the appropriate environment (oracle, amazon, ...). Simplified example:
mkdir -p /tmp/backup$NAME
su - oracle -c "~/.backups/export-test.sh"
tar cf /tmp/backup/$NOW.tar /tmp/backup$NAME
su - amazon -c "upload_to_amazon.sh /tmp/backup/$NOW.tar"
The export-test.sh script then does some tasks as user oracle:
mkdir -p $TMP_LOCATION
cd ~/.backups
exp $TMP_LOCATION/$NAME-$NOW
When I try to mimic this behaviour in Python, I come up with the following (started from cron as root):
import os
import pwd
import subprocess

name = "oracle"

# part run as root
os.makedirs(tmp_backup + name)
os.setegid(pwd.getpwnam(name).pw_gid)
os.seteuid(pwd.getpwnam(name).pw_uid)

# part run as oracle
os.makedirs(tmp_location)
os.chdir(os.path.expanduser("~{user}/.backups".format(user=name)))
subprocess.check_call(["exp",
                       os.path.join(tmp_location, name + '-' + now)])
In bash, when using su -, a real new shell is invoked and all of that user's environment variables are set. How can I improve this in my Python script? Is there a standard recipe I can follow? I'm thinking of environment variables, umask, ...
The environment is Solaris, if that matters.
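For reference, the most literal way to get the same full login environment from Python is to shell out to su - itself, exactly as the bash script does. A minimal sketch (user name and script path taken from the question; requires root, like the cron job):

```python
import subprocess

def su_argv(user, command):
    """Build the argv for `su - user -c command` (a full login shell,
    so the user's .profile runs and sets ORACLE_HOME, PATH, umask, ...)."""
    return ["su", "-", user, "-c", command]

def run_as(user, command):
    """Run `command` as `user` via a login shell; raises on failure."""
    return subprocess.check_call(su_argv(user, command))

# e.g. (requires root):
# run_as("oracle", "~/.backups/export-test.sh")
```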
"all environment variables of that user are set" -- usually because a shell runs a .profile file when it starts up.
You have several choices.

1. Create a proper subprocess with subprocess.Popen to execute the shell .profile -- same as su -.

2. Carefully locate the environment variable settings and mimic them in Python. The issue is that a .profile can do all kinds of crazy things, making it a potential problem to determine the exact effects of the .profile.

3. Extract the relevant environment variables to make them accessible to both the shell environment and your Python programs.
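The first option can also be driven from Python directly: capture the environment a login shell would produce once, then reuse it for later subprocess calls. A sketch (assumes the script runs as root so su works non-interactively; env(1) output format is assumed to be simple KEY=value lines):

```python
import subprocess

def parse_env(text):
    """Parse KEY=value lines (the output of env(1)) into a dict.
    Lines without '=' are ignored; values may themselves contain '='."""
    env = {}
    for line in text.splitlines():
        if "=" in line:
            key, _, value = line.partition("=")
            env[key] = value
    return env

def login_env(user):
    """Capture the environment a login shell gives `user` (requires root)."""
    out = subprocess.check_output(["su", "-", user, "-c", "env"])
    return parse_env(out.decode())

# oracle_env = login_env("oracle")
# subprocess.check_call(["exp", dumpfile], env=oracle_env)
```

Multi-line variable values would confuse this simple parser, but for typical .profile output it is enough.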
First. Read the .profile for each user to be clear on what environment variables it sets (as distinct from aliases or other craziness that doesn't apply to your Python script). Some of these environment variables are relevant to the scripts you're running; some aren't.
Second. Split the "relevant" environment variables into a tidy env_backups.sh or env_uploads.sh script. Once you have those environment variable scripts, update your .profile files to replace the environment variable settings with source env_backups.sh or source env_uploads.sh.
Third. Source the relevant env_this and env_that scripts before running the Python program. Now your Python environment shares the variables with your shell environment, and you only maintain them in one place.
my_script.sh:
source ~oracle/env_backup.sh
source ~amazon/env_uploads.sh
python my_script.py
That seems best to me. (Since that's how we do it.)
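A related Python-side alternative to su is to drop privileges per subprocess with a preexec_fn: a real setuid/setgid drop in the child (rather than the seteuid of the question, which is revertible and easy to get wrong). A sketch:

```python
import os
import pwd
import subprocess

def demote(user):
    """Return a preexec_fn that makes the child process run as `user`:
    supplementary groups, then gid, then uid (order matters -- once the
    uid is dropped, the gid can no longer be changed)."""
    pw = pwd.getpwnam(user)
    def set_ids():
        os.initgroups(user, pw.pw_gid)
        os.setgid(pw.pw_gid)
        os.setuid(pw.pw_uid)
    return set_ids

# e.g. (requires root):
# subprocess.check_call(["/export/home/oracle/.backups/export-test.sh"],
#                       preexec_fn=demote("oracle"))
```

This drops privileges only in the child, so the parent keeps running as root for the tasks that need it; note it does not read the user's .profile, so the environment still has to be provided separately.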
It turns out I can run the amazon upload as root without needing environment variables after all; I used boto for that. As for the oracle environment variables, I used this piece of code:
if "ORACLE_HOME" not in os.environ or os.environ["ORACLE_HOME"] != ORACLE_HOME:
    logger.debug("setting ORACLE_HOME='{oh}'".format(oh=ORACLE_HOME))
    os.environ['ORACLE_HOME'] = ORACLE_HOME
if ORACLE_HOME + "/bin" not in os.environ["PATH"].split(":"):
    logger.debug("setting PATH='{p}'".format(p=os.path.expandvars(ORACLE_PATH)))
    os.environ['PATH'] = os.path.expandvars(ORACLE_PATH)
if "NLS_LANG" not in os.environ or os.environ["NLS_LANG"] != NLS_LANG:
    logger.debug("setting NLS_LANG='{n}'".format(n=NLS_LANG))
    os.environ['NLS_LANG'] = NLS_LANG
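With those variables in place, exp inherits them through os.environ when launched via subprocess. A sketch of the invocation (the FILE= argument layout is illustrative; adapt the exp parameters to the real export):

```python
import os
import subprocess

def dump_path(tmp_location, name, now):
    """Build the dump file base path, mirroring $TMP_LOCATION/$NAME-$NOW
    from the bash script."""
    return os.path.join(tmp_location, "{0}-{1}".format(name, now))

def run_export(tmp_location, name, now):
    """Invoke Oracle's exp; it picks up ORACLE_HOME, PATH and NLS_LANG
    from os.environ as set above."""
    dump = dump_path(tmp_location, name, now)
    return subprocess.check_call(["exp", "FILE={0}.dmp".format(dump)])

# run_export("/tmp/backup-oracle", "oracle", "20110101")
```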