
How to import a CSV into SQLite using RSQLite?


As the question says: I found that I can use .import in the sqlite shell, but it doesn't seem to work from within R. Any suggestions?


You can use read.csv.sql in the sqldf package; the read itself is only one line of code. Assuming you want to create a new database, testingdb, and then read a file into it, try this:

# load sqldf (which will pull in RSQLite)
library(sqldf)

# create a test file
write.table(iris, "iris.csv", sep = ",", quote = FALSE, row.names = FALSE)

# create an empty database.
# can skip this step if the database already exists.
sqldf("attach testingdb as new")
# or: cat(file = "testingdb")

# read into a table called iris in the testingdb sqlite database
read.csv.sql("iris.csv", sql = "create table main.iris as select * from file",
  dbname = "testingdb")

# look at the first three lines
sqldf("select * from main.iris limit 3", dbname = "testingdb")

The above uses sqldf, which in turn uses RSQLite. You can also use RSQLite directly; see ?dbWriteTable in RSQLite. Note that importing a file directly with dbWriteTable can run into problems with line endings that sqldf usually handles for you automatically.
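For instance, a minimal sketch of the direct RSQLite route (assuming the same iris.csv and testingdb file names as above; the CSV is first read into a data frame with read.csv, which sidesteps the line-ending issue):

# connect to (or create) the database file
library(RSQLite)
con <- dbConnect(SQLite(), dbname = "testingdb")

# read the CSV into a data frame, then write it out as a table named iris
df <- read.csv("iris.csv")
dbWriteTable(con, "iris", df, overwrite = TRUE)

# check the first three rows
dbGetQuery(con, "select * from iris limit 3")

dbDisconnect(con)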

If your intention is to read the file into R immediately after reading it into the database, and you don't really need the database after that, then see:

http://code.google.com/p/sqldf/#Example_13._read.csv.sql_and_read.csv2.sql
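In that case read.csv.sql can be called without a dbname; it then stages the file in a temporary SQLite database and simply hands back a data frame. A minimal sketch, again assuming the iris.csv file created above:

library(sqldf)

# stages iris.csv in a temporary SQLite database and returns a data frame
DF <- read.csv.sql("iris.csv", sql = "select * from file")
head(DF, 3)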


I tend to do that with the sqldf package: Quickly reading very large tables as dataframes in R

Keep in mind that in that example I read the CSV into a temporary SQLite database; you'll obviously need to change that bit.
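For the large-file case the useful trick is to push a filter into the SQL, so only the rows you need ever reach R. A sketch assuming the iris.csv file from the first answer (the column name Species comes from that file):

library(sqldf)

# the whole file is staged in a temporary SQLite database (not in R memory);
# only the filtered rows come back as a data frame
DF <- read.csv.sql("iris.csv",
  sql = "select * from file where Species = 'virginica'")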

