Combining greps to make script to count files in folder

I need some help combining elements of scripts to form a readable output.

Basically I need to get the username from the folder structure listed below and count the number of lines in that user's folder for files of type *.ano.

This is shown in the extract below; note that the location of the username in the path is not always the same counting from the front.

/home/user/Drive-backup/2010 Backup/2010 Account/Jan/usernameneedtogrep/user.dir/4.txt

/home/user/Drive-backup/2011 Backup/2010 Account/Jan/usernameneedtogrep/user.dir/3.ano

/home/user/Drive-backup/2010 Backup/2010 Account/Jan/usernameneedtogrep/user.dir/4.ano

awk -F/ '{print $(NF-2)}'

This will give me the username I need, but I also need to know how many non-blank lines there are in that user's folder for files of type *.ano. I have the grep below that works, but I don't know how to put it all together so it can output a file that makes sense.

grep -cv '^[[:space:]]*$' *.ano | awk -F: '{ s+=$2 } END { print s }'

Example output needed

UserA   500
UserB 2
UserC 20


find /home -name '*.ano' | awk -F/ '{print $(NF-2)}' | sort | uniq -c

That ought to give you the number of *.ano files per user, assuming your awk is correct. I often use sort | uniq -c to count the number of instances of a string, in this case the username, as opposed to wc -l, which only counts input lines.
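For instance, a quick illustration of the difference (the sample usernames here are just placeholders):

printf '%s\n' UserA UserB UserA UserA | sort | uniq -c    # 3 UserA, 1 UserB
printf '%s\n' UserA UserB UserA UserA | wc -l             # 4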

Enjoy.


Have a look at wc (word count).
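For example, piping the non-blank lines through wc -l gives the count directly (a minimal sketch, assuming it is run inside one user's user.dir directory):

grep -hv '^[[:space:]]*$' *.ano | wc -l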


To count the number of *.ano files in a directory you can use

find "$dir" -iname '*.ano' | wc -l

If you want to do that for all directories in some directory, you can just use a for loop:

for dir in * ; do
    echo "user $dir"
    find "$dir" -iname '*.ano' | wc -l
done
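If you want the name and the count on one line, as in the example output, a small variation of the same loop should do it (an untested sketch; like the loop above, this counts files, not lines):

for dir in * ; do
    printf '%s %s\n' "$dir" "$(find "$dir" -iname '*.ano' | wc -l)"
done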


Execute the bash script below from the folder

/home/user/Drive-backup/2010 Backup/2010 Account/Jan

and it will report the number of non-blank lines per user.

#!/bin/bash

# save where we start
base=$(pwd)

# walk all top-level dirs, skip '.' (read line by line so names with spaces survive)
find . \( -type d ! -name . -prune \) | while read -r d; do
    cd "$base"
    cd "$d" || continue
    # search for all files named *.ano and count their non-blank lines
    sum=$(find . -type f -name '*.ano' -exec grep -cv '^[[:space:]]*$' {} \; | awk '{sum+=$0} END {print sum}')
    echo "$d" "$sum"
done
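To run it, something along these lines should work (count_ano.sh is just an illustrative name for the script above):

cd "/home/user/Drive-backup/2010 Backup/2010 Account/Jan"
bash count_ano.sh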


This might be what you want (untested); it requires bash version 4 for associative arrays:

declare -A count
cd /home/user/Drive-backup
for userdir in */*/*/*; do
    username=${userdir##*/}
    # grep -c prefixes each count with "file:", so split on ':' and sum the last field
    lines=$(grep -cv '^[[:space:]]*$' "$userdir"/user.dir/*.ano | awk -F: '{sum += $NF} END {print sum}')
    (( count[$username] += ${lines:-0} ))
done

for user in "${!count[@]}"; do
    echo "$user" "${count[$user]}"
done
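As an aside, ${userdir##*/} strips everything up to the last slash, which is how the username falls out of the path, e.g.:

userdir='2010 Backup/2010 Account/Jan/UserA'
echo "${userdir##*/}"    # prints: UserA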


Here's yet another way of doing it (on Mac OS X 10.6):

find -x "$PWD" -type f -iname "*.ano" -exec bash -c '
  ar=( "${@%/*}" )                 # perform a "dirname" command on every array item
  printf "%s\000" "${ar[@]%/*}"    # do a second "dirname" and add a null byte to every array item
' arg0 '{}' + | sort -uz | 
while IFS="" read -r -d '' userDir; do
  # to-do: customize output to get example output needed
  echo "$userDir"
  basename "$userDir"
  find -x "${userDir}" -type f -iname "*.ano" -print0 |
  xargs -0 -n 500 grep -hcv '^[[:space:]]*$' | awk '{ s+=$0 } END { print s }'
  #xargs -0 -n 500 grep -cv '^[[:space:]]*$' | awk -F: '{ s+=$NF } END { print s }'
  printf '%s\n' '----------'
done
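To match the requested "user  count" layout, the body of the loop could be reworked along these lines (an untested sketch of the to-do above):

user=$(basename "$userDir")
lines=$(find -x "$userDir" -type f -iname "*.ano" -print0 |
        xargs -0 -n 500 grep -hcv '^[[:space:]]*$' | awk '{ s+=$0 } END { print s }')
printf '%s %s\n' "$user" "$lines"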