I have several large files, each of which I want to chunk/split into a predefined number of parts.
Is there an efficient way to do it in Unix (e.g. via awk/sed/perl)?
Also, each file can have a different number of lines:
File1.txt 20,300,055 lines
File2.txt 10,033,221 lines
etc...
If you just want to split each file into files of a fixed number of lines or bytes, you can use the split
command.
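For example, a minimal sketch assuming GNU coreutils split (the -n option is a GNU extension, not POSIX; the file and prefix names are placeholders):

    split -l 1000000 File1.txt File1.part.   # pieces of a fixed number of lines
    split -b 100M File1.txt File1.part.      # pieces of a fixed number of bytes
    split -n l/4 File1.txt File1.part.       # 4 pieces, without breaking lines

The last form handles the "predefined number of parts" case directly when GNU split is available.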
I found this. You may need to work out the number of lines per part first (total lines divided by the number of parts).
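If your split doesn't support -n, a small shell sketch of that calculation (the part count and file names here are assumptions, adjust as needed):

    nparts=4
    lines=$(wc -l < File1.txt)
    per_part=$(( (lines + nparts - 1) / nparts ))   # round up so we get at most nparts pieces
    split -l "$per_part" File1.txt File1.part.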
You can use csplit, which can split by context. Check the man/info page of csplit for more info.
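For instance, to split before every line matching a pattern (the pattern here is a made-up example; repeating it with '{*}' until end of file is a GNU extension):

    csplit File1.txt '/^Chapter/' '{*}'   # writes pieces xx00, xx01, ...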