4 Answers

lp0sw83n1#

From within Postgres:

COPY table_name FROM PROGRAM 'unzip -p input.csv.zip' DELIMITER ',';

From the man page for unzip -p:

-p     extract files to pipe (stdout). Nothing but the file data is sent to stdout, and the files are always extracted in binary format, just as they are stored (no conversions).
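Note that COPY ... FROM PROGRAM runs the command on the database server, so it needs superuser rights (or, on Postgres 11+, membership in the pg_execute_server_program role). A fuller sketch, assuming a hypothetical sales table and a CSV with a header row:

-- Hypothetical target table; adjust columns to match the CSV.
CREATE TABLE sales (id integer, amount numeric, sold_on date);

-- unzip runs on the database server; HEADER skips the CSV's first line.
COPY sales FROM PROGRAM 'unzip -p input.csv.zip' WITH (FORMAT csv, HEADER);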
dxxyhpgq2#
Could you do something like:

unzip -p myfile.zip | gzip > myfile.gz

If you have enough files, it's easy enough to automate.
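For many files, that automation could be a simple loop; a sketch, assuming each archive holds a single CSV and the file names are hypothetical:

# Re-compress every zipped CSV in the current directory as gzip.
for f in *.csv.zip; do
    unzip -p "$f" | gzip > "${f%.zip}.gz"   # myfile.csv.zip -> myfile.csv.gz
done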
ebdffaop3#
This might only work when loading Redshift from S3, but you can actually just include a "gzip" flag when copying data to Redshift tables. This is the format that works for me if my S3 bucket contains a gzipped .csv:

copy <table> from 's3://mybucket/<foldername>' '<aws-auth-args>' delimiter ',' gzip;
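Spelled out with the IAM-role form of the Redshift credentials, a sketch; the table name, bucket path, and role ARN below are all placeholders:

-- Assumes an IAM role with S3 read access is attached to the cluster.
copy my_table
from 's3://mybucket/myfolder/'
iam_role 'arn:aws:iam::123456789012:role/MyRedshiftRole'
delimiter ','
gzip;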
hsgswve44#
unzip -p /path/to/.zip | psql -U user
The user must have superuser privileges, otherwise you will get a permission error.

To learn more about this functionality, see https://www.postgresql.org/docs/8.0/static/backup.html

Basically, this command is used for handling large databases.
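If the user lacks superuser rights and the goal is just loading a CSV, psql's client-side \copy is a common workaround, since the data is read on the client rather than the server. A sketch, assuming the zip holds a single CSV; the database and table names are placeholders:

# \copy runs client-side, so no superuser privileges are needed.
unzip -p /path/to/file.zip | psql -U user -d mydb -c "\copy table_name FROM STDIN WITH (FORMAT csv)"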