I tried to play around with .import but it seems too limited, handling only CSV and delimited files. Is it possible to import a gzip file, or at least pipe data in from the command line?
Also, could I skip some unwanted columns, like MySQL's "LOAD DATA INFILE" can?
If you don't want to use named pipes, you could also:
zcat $YOURFILE.gz | sqlite3 $YOURDB.sqlite ".import /dev/stdin $TABLENAME"
If you need to modify stuff before import, you could use perl (or awk, sed, whatever) between the zcat and sqlite commands.
For example, if your file already uses the pipe character as a delimiter and you would like to import only columns 0 to 3 and 5 to 6:
zcat $YOURFILE.gz | perl -F'\|' -anle 'print join("|", @F[0..3,5..6])' | sqlite3 $YOURDB.sqlite ".import /dev/stdin $TABLENAME"
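Depending on the sqlite3 version, .import may require the destination table to exist already (newer shells can create it from a header row). A minimal sketch of the full sequence, using hypothetical names mydb.sqlite and mytable with pipe-delimited data as above:
# create the table up front; the shell's default separator is already "|"
sqlite3 mydb.sqlite "CREATE TABLE IF NOT EXISTS mytable (a TEXT, b TEXT, c TEXT);"
zcat $YOURFILE.gz | sqlite3 mydb.sqlite ".import /dev/stdin mytable"
sqlite3 mydb.sqlite "SELECT COUNT(*) FROM mytable;"   # sanity check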
$ mkfifo tempfile
$ zcat my_records.csv.gz > tempfile
This works like magic!
Although mkfifo does create a file entry on disk, its size is 0 bytes. When you run $ zcat my_records.csv.gz > tempfile, it will block at the command prompt.
This allows you to run
sqlite> .import tempfile db_table
After sqlite3 finishes importing from the named pipe, the zcat command will also finish running. You can then remove the named pipe:
$ rm -f tempfile
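If you prefer a single script over two terminals, the zcat writer can be backgrounded; a sketch, assuming a database my.db that already contains db_table:
mkfifo tempfile
zcat my_records.csv.gz > tempfile &    # blocks until sqlite3 starts reading
sqlite3 my.db ".import tempfile db_table"
wait                                   # let the backgrounded zcat exit
rm -f tempfile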
zcat data.gz |\
cat <(echo -e ".separator ','\n.import /dev/stdin dest_table") - |\
sqlite3 db.sqlite
works nicely (Linux).
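A variant of the same trick, assuming your sqlite3 shell supports the -cmd option (which runs a dot-command before the arguments are processed):
# set the separator first, then import from the pipe
zcat data.gz | sqlite3 -cmd '.separator ,' db.sqlite '.import /dev/stdin dest_table'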
You can create a named pipe. It will look like a normal file to SQLite, while the process writing to it decompresses on the fly; SQLite will know nothing about it. It turns out the example on Wikipedia is with gzip: http://en.wikipedia.org/wiki/Named_pipe
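In bash, process substitution gives you the same pipe without the bookkeeping; a sketch, assuming a table my_table already exists in db.sqlite:
# bash creates and removes the pipe behind the scenes
sqlite3 db.sqlite '.import /dev/stdin my_table' < <(zcat my_records.csv.gz)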
You could write a parser for the data that converts it to a series of SQL statements. Perl is a good language for that. It can even handle gzip'd files.
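A rough sketch of that approach, assuming simple comma-separated data with no quoted fields and a hypothetical table my_table:
# turn each row into an INSERT; \x27 is a single quote, doubled for SQL escaping
zcat my_records.csv.gz | perl -nle '
    my @f = map { (my $v = $_) =~ s/\x27/\x27\x27/g; "\x27$v\x27" } split /,/;
    print "INSERT INTO my_table VALUES (", join(",", @f), ");"
' | sqlite3 db.sqlite
Wrapping the generated statements in BEGIN/COMMIT would make the import noticeably faster.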
Are you running this in a *Nix OS? If so, you could create a temporary file to hold the decompressed data:
tf="$(mktemp)" &&
zcat <my_records.csv.gz >"$tf"
sqlite3 /path/to/database.sqlite3 ".import $tf"
rm -f "$tf"