I want to grep the errors out of a log file and save the count in a variable. When I use:
errors = os.system("cat log.txt | grep 'ERROR' | wc -l")
I get the return code indicating whether the command succeeded, not the output. When I use:
errors = os.popen("cat log.txt | grep 'ERROR' | wc -l")
I get the pipe object itself rather than the count.
When I run this on the command line I get 3, as that's how many errors there are.
Can anyone suggest another way in Python that will let me capture the output of this bash command?
Thanks
os.popen is deprecated; use the subprocess module instead. For example, in your case:
from subprocess import Popen, PIPE

p1 = Popen(["cat", "log.txt"], stdout=PIPE)
p2 = Popen(["grep", "ERROR"], stdin=p1.stdout, stdout=PIPE)
output = p2.communicate()[0]
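To reproduce the final wc -l stage, one possibility (just a sketch, using the output variable from above) is to count the returned lines in Python:

# output contains the matching lines; counting them gives the same
# number that wc -l would print
nerrors = len(output.splitlines())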
First open a pipe using popen as you did.
p = os.popen("cat log.txt | grep 'ERROR' | wc -l")
Now just access the pipe like a normal file:
output = p.readline()
This will be a string, so you'll still have to do some additional parsing, but that shouldn't be a problem.
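For example, since wc -l prints a single number, converting that line to an int is enough (a minimal sketch):

nerrors = int(output.strip())  # e.g. "3\n" -> 3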
EDIT: OK, it seems that from Python 2.6 onwards, os.popen is deprecated. I thus defer to the answers that correctly use subprocess.Popen instead. Thanks for that, guys.
You're probably looking for:
grep -c 'ERROR' log.txt
Generally, for spawning a subprocess you should use the subprocess module. There are plenty of examples; I'm sure you won't get lost.
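For instance, here is a minimal sketch combining grep -c with subprocess (assuming Python 2.7+ for check_output; note that grep exits with status 1 when there are no matches, which check_output turns into a CalledProcessError):

from subprocess import check_output

# grep -c prints the number of matching lines, so no wc -l is needed
nerrors = int(check_output(["grep", "-c", "ERROR", "log.txt"]))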
How many occurrences of 'ERROR' in the file:
nerrors = open('log.txt').read().count('ERROR') # put whole file in memory
How many lines contain 'ERROR':
nerrors = sum(1 for line in open('log.txt') if 'ERROR' in line) # line at a time
If you must use the literal bash pipeline, then in Python 2.7+:
from subprocess import check_output as qx
nerrors = int(qx("cat log.txt | grep 'ERROR' | wc -l", shell=True))
See Capturing system command output as a string for an implementation of check_output() for Python < 2.7.