
Export to CSV and Compress with GZIP in Postgres

https://www.devze.com 2023-01-20 21:20 (source: web)
I need to export a big table to a CSV file and compress it.

I can export it using COPY command from postgres like -

COPY foo_table TO '/tmp/foo_table.csv' DELIMITER ',' CSV HEADER;

And then compress it using gzip like -

gzip -c foo_table.csv > foo.gz

The problem with this approach is that I need to create this intermediate CSV file, which is itself huge, before I get my final compressed file.

Is there a way to export the table as CSV and compress the file in one step?

Regards, Sujit


The trick is to make COPY send its output to stdout, then pipe the output through gzip:

psql -c "COPY foo_table TO stdout DELIMITER ',' CSV HEADER" \
    | gzip > foo_table.csv.gz
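For completeness, the same pipe works in reverse to reload the dump; a minimal sketch, assuming foo_table already exists with a matching column layout and that the dump was made with the same delimiter and header settings:

```shell
# Stream the compressed dump back into Postgres without an intermediate file.
# Assumes foo_table exists and matches the dumped column layout.
gunzip -c foo_table.csv.gz \
    | psql -c "COPY foo_table FROM stdin DELIMITER ',' CSV HEADER"
```

Because both sides stream, no uncompressed copy ever touches the disk in either direction.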


You can do it directly with COPY ... TO PROGRAM, as per the docs: https://www.postgresql.org/docs/9.4/sql-copy.html

COPY foo_table TO PROGRAM 'gzip > /tmp/foo_table.csv.gz' DELIMITER ',' CSV HEADER;
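The reverse direction works the same way server-side; a sketch, assuming the dump sits on the database server's filesystem and your role is allowed to run server-side programs (superuser, or membership in pg_execute_server_program on newer versions):

```shell
# Server-side load: gunzip runs on the database server, not the client.
# Assumes /tmp/foo_table.csv.gz is readable by the postgres server process.
psql -c "COPY foo_table FROM PROGRAM 'gunzip -c /tmp/foo_table.csv.gz' DELIMITER ',' CSV HEADER"
```

Note that both TO PROGRAM and FROM PROGRAM execute on the server, so paths and permissions are evaluated there, not on the machine running psql.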


Expanding a bit on @Joey's answer, the version below adds a couple more features available in the manual.

psql -c "COPY \"Foo_table\" (column1, column2) TO stdout DELIMITER ',' CSV HEADER" \
    | gzip > foo_table.csv.gz

If you have capital letters in your table name (woe be unto you), you need the \" before and after the table name.

The second thing I've added is column listing.

Also note from the docs:

This operation is not as efficient as the SQL COPY command because all data must pass through the client/server connection. For large amounts of data the SQL command might be preferable.


PostgreSQL 13.4

psql's \copy meta-command also works, combined with a SELECT column list and a date +"%Y-%m-%d_%H%M%S" timestamp in the dump filename.

\copy (SELECT id, column_1, column_2, ... FROM foo_table) TO PROGRAM 'gzip > ~/Downloads/foo_table_dump_`date +"%Y-%m-%d_%H%M%S"`.csv.gz' DELIMITER ',' CSV HEADER

(Note that \copy must be written on a single line: psql meta-commands end at the newline, so backslash line continuations do not work here.)
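A quick way to sanity-check the compressed dump without extracting it to disk, using standard gzip tooling (the filename here stands in for whatever the timestamped dump produced):

```shell
# Verify archive integrity, peek at the header row, and count lines.
gzip -t foo_table_dump.csv.gz                 # exits non-zero if corrupt
gunzip -c foo_table_dump.csv.gz | head -n 1   # column header line
gunzip -c foo_table_dump.csv.gz | wc -l       # line count incl. header
```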
