I have a data file of 100,000 rows and 258 columns, delimited by semicolons. read.table(file, sep=";", fill=TRUE, header=FALSE) reads in 60,610 rows, while read.csv2(file, header=FALSE) shows 100,025 rows! Using col.names or count.fields() makes no difference. The weirdest thing is that if I read the data into Excel, save it as CSV, and then use read.csv(), the import is spot on. But if I change the delimiter to "," in the original text file and try read.csv(), it again reads in only 60,610 rows. There are no warnings in any of these cases. What's going on?
If you look at the code for read.csv2 (just type read.csv2 and hit <enter> at your R command line), you'll see it does nothing but call read.table with some different default values. That should give you a hint about what is happening...
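To spell out the hint: read.table's defaults include quote = "\"'" and comment.char = "#", so a stray apostrophe in the data silently merges rows and a # truncates lines; read.csv2 overrides both. A minimal sketch of making read.table match (the file name here is a stand-in for your own):

```r
# read.csv2 is just read.table with CSV-friendly defaults.
# Supplying those defaults explicitly should recover all 100,000 rows:
dat <- read.table("data.txt",        # placeholder for your file
                  sep = ";",
                  header = FALSE,
                  quote = "",        # default quote = "\"'" treats stray
                                     # apostrophes as quotes, merging rows
                  comment.char = "", # default "#" truncates lines at #
                  fill = TRUE)
nrow(dat)
```

The Excel round trip "fixed" the file only because Excel re-quotes the fields consistently on export, which is why read.csv() then imported it cleanly.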