I have a big compressed CSV file (25 GB) and I want to import it into PostgreSQL 9.5. Is there any fast way to import a zip or gzip file into Postgres without extracting the file first?
There is an old trick: use a named pipe (works on Unix; I don't know about Windows).
mkfifo /tmp/omyfifo
zcat mycsv.csv.gz > /tmp/omyfifo &
COPY mytable(col1, ...) FROM '/tmp/omyfifo' WITH (FORMAT csv);
rm /tmp/omyfifo
The zcat in the background will block until a reader (here: the COPY command) starts reading, and it will terminate at EOF (or when the reader closes the pipe).
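The same trick works with any decompressor that can write to stdout. For a .zip archive, for example, unzip -p extracts a member to stdout (the archive and member names below are just placeholders):

unzip -p mycsv.zip mycsv.csv > /tmp/omyfifo &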
You could even start multiple pipe+zcat pairs, which will be picked up by multiple COPY statements in your SQL script, as sketched below.
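A minimal sketch of that, assuming the data has been split into two gzipped chunks beforehand (the file names are made up) and that the two COPY statements run in two separate sessions so they actually load concurrently:

# shell, on the server: one pipe per chunk
mkfifo /tmp/fifo1 /tmp/fifo2
zcat part1.csv.gz > /tmp/fifo1 &
zcat part2.csv.gz > /tmp/fifo2 &

-- SQL, one statement per session:
COPY mytable(col1, ...) FROM '/tmp/fifo1' WITH (FORMAT csv);
COPY mytable(col1, ...) FROM '/tmp/fifo2' WITH (FORMAT csv);

# shell: clean up once both loads have finished
rm /tmp/fifo1 /tmp/fifo2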
This will work from pgAdmin, but the FIFO (plus the zcat process) must be present on the machine where the DBMS server runs, since a server-side COPY ... FROM 'file' reads from the server's filesystem (and, on 9.5, requires superuser rights).
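Concretely, the shell commands go into a terminal on the database host (here reached via ssh; dbhost is a placeholder), while only the COPY goes into pgAdmin's query window:

# in a shell on the database host, e.g. via: ssh dbhost
mkfifo /tmp/omyfifo
zcat mycsv.csv.gz > /tmp/omyfifo &

-- in pgAdmin, connected to that same server:
COPY mytable(col1, ...) FROM '/tmp/omyfifo' WITH (FORMAT csv);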
BTW: a similar trick using netcat can be used to read a file from a remote machine (which, of course, has to write the file into the network socket), as sketched below.
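A rough sketch of that variant (untested; netcat flags differ between the BSD, GNU and traditional builds, and dbhost/9999 are a made-up host and port). A listener on the database host feeds the FIFO, and the remote machine streams the decompressed file into the socket:

# on the database host:
mkfifo /tmp/omyfifo
nc -l 9999 > /tmp/omyfifo &     # traditional netcat wants: nc -l -p 9999

# on the remote machine that holds the file:
zcat mycsv.csv.gz | nc dbhost 9999

-- then, on the server, as before:
COPY mytable(col1, ...) FROM '/tmp/omyfifo' WITH (FORMAT csv);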