Python logging redirecting stdout from multiple processes

I am trying to capture the stderr and stdout of a number of processes and write their output to a log file using the Python logging module. The code below seems to achieve this. Presently I poll each process's stdout and write to the logger if there is any data. Is there a better way of doing this?

I would also like to have a master log of all the individual processes' activity; in other words, I want to automatically (without polling) write all the stdout/stderr of each process to a master logger. Is this possible?

Thanks

import fcntl
import logging
import logging.handlers
import os
from subprocess import Popen, PIPE, STDOUT
from time import sleep

logs_dir = "./logs/"             # example value
max_log_file_size = 1024 * 1024  # example value

class MyProcess:
    def __init__(self, process_name, param):
        self.param = param
        self.logfile = logs_dir + "Display_" + str(param) + ".log"
        self.args = [process_name, str(param)]
        self.logger_name = process_name + str(param)
        self.start()
        self.logger = self.initLogger()

    def start(self):
        self.process = Popen(self.args, bufsize=1, stdout=PIPE, stderr=STDOUT) # line buffered
        # make each process's stdout non-blocking
        fd = self.process.stdout
        fl = fcntl.fcntl(fd, fcntl.F_GETFL)
        fcntl.fcntl(fd, fcntl.F_SETFL, fl | os.O_NONBLOCK)

    def initLogger(self):
        f = logging.Formatter("%(levelname)s -%(name)s - %(asctime)s - %(message)s")
        fh = logging.handlers.RotatingFileHandler(self.logfile, maxBytes=max_log_file_size, backupCount=10)
        fh.setFormatter(f)

        logger = logging.getLogger(self.logger_name)
        logger.setLevel(logging.DEBUG)
        logger.addHandler(fh) # file handler
        return logger

    def getOutput(self): # non-blocking read of stdout
        try:
            return self.process.stdout.readline()
        except IOError: # no data available yet
            return None

    def writeLog(self):
        line = self.getOutput()
        if line:
            self.logger.debug(line.strip())
            #print line.strip()


process_name = 'my_prog'
num_processes = 10
processes = []

for param in range(num_processes):
    processes.append(MyProcess(process_name, param))

while True:
    for p in processes:
        p.writeLog()

    sleep(0.001)


Your options here are:

  • Non-blocking I/O: This is what you have done :)

  • The select module: You can use either poll() or select() to dispatch reads for the different inputs.

  • Threads: Create a thread for each file descriptor you want to monitor and use blocking I/O. Not advisable for large numbers of file descriptors, but at least it works on Windows (see the sketch after this list).

  • Third-party libraries: Apparently, you can also use Twisted or pyevent for asynchronous file access, but I have never tried that myself...
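For illustration, here is a rough sketch of the threads option; it assumes you drop the fcntl/O_NONBLOCK step from start() so that the per-thread reads can simply block:

import threading

def pump(p):
    # blocking reads; an empty read (b'') signals EOF
    for line in iter(p.process.stdout.readline, b''):
        p.logger.debug(line.strip())

for p in processes:
    t = threading.Thread(target=pump, args=(p,))
    t.daemon = True  # don't let reader threads keep the interpreter alive
    t.start()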

For more information, watch this video on non-blocking I/O with Python.

Since your approach seems to work, I would just stick with it, as long as the added processor load does not bother you. If it does, I would go for select.select() on Unix.
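A minimal sketch of the select.select() version, reusing your MyProcess objects (getOutput() already copes with the non-blocking file descriptors):

import select

# map each child's stdout file descriptor to its MyProcess wrapper
fd_to_proc = dict((p.process.stdout.fileno(), p) for p in processes)

while fd_to_proc:
    # blocks until at least one child has produced output, so no sleep() is needed
    readable, _, _ = select.select(list(fd_to_proc), [], [])
    for fd in readable:
        p = fd_to_proc[fd]
        line = p.getOutput()
        if line:
            p.logger.debug(line.strip())
        else:
            # an empty read after select reports "readable" means EOF
            del fd_to_proc[fd]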

As for your question about the master logger: because you want to tee off the individual outputs, you can't just redirect everything to a single master logger. You have to do the tee yourself, writing each line to both loggers.
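A rough sketch of that, assuming a master.log file name and reusing the formatter settings from initLogger():

master = logging.getLogger("master")
master.setLevel(logging.DEBUG)
mh = logging.handlers.RotatingFileHandler(logs_dir + "master.log",
                                          maxBytes=max_log_file_size, backupCount=10)
mh.setFormatter(logging.Formatter("%(levelname)s -%(name)s - %(asctime)s - %(message)s"))
master.addHandler(mh)

def writeLog(self):  # replacement for MyProcess.writeLog
    line = self.getOutput()
    if line:
        self.logger.debug(line.strip())                         # per-process log
        master.debug("%s: %s", self.logger_name, line.strip())  # combined log

Alternatively, if you name the per-process loggers "master." + self.logger_name and attach the rotating file handler to the "master" logger, the logging module's propagation writes every record to the master file for you, and writeLog() stays unchanged.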
