From: Craig Ringer <craig(at)postnewspapers(dot)com(dot)au>
To: david(dot)sahagian(at)emc(dot)com
Cc: pgsql-general(at)postgresql(dot)org
Subject: Re: COPY TO '|gzip > /my/cool/file.gz'
Date: 2011-07-21 03:28:33
Message-ID: 4E279CE1.2040006@postnewspapers.com.au
Lists: pgsql-general
On 21/07/11 01:59, david(dot)sahagian(at)emc(dot)com wrote:
> From May 31, 2006; 12:03pm . . .
>
> "It struck me that we are missing a feature that's fairly common in Unix programs.
> Perhaps COPY ought to have the ability to pipe its output to a shell command,
> or read input from a shell command. "
> Maybe something like:
> COPY mytable TO '| gzip >/home/tgl/mytable.dump.gz';
>
> Is such a feature (ie being able to tell postgres to write a compressed file via COPY TO) being worked on ?
Not that I've heard of.
In addition to the hint given about using COPY ... TO STDOUT from a "psql
-c" invocation (a sketch of that appears further down), there is another
option: you can create a named pipe (fifo) file node on the server and
have COPY TO write to it, e.g.:
$ mkfifo /server/path/to/gzfifo
$ gzip < /server/path/to/gzfifo > /server/path/to/out.gz &
$ psql -c "COPY tablename TO '/server/path/to/gzfifo';"
gzip will terminate automatically once COPY finishes and closes its end of
the fifo, since that gives gzip end-of-file on its input. The fifo itself
is not removed and can be re-used.
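For reference, the client-side route mentioned at the start (COPY ... TO
STDOUT piped through gzip) avoids writing to the server's filesystem at
all, which also sidesteps the superuser requirement for server-side COPY.
A rough sketch, with the database name, table name and output path as
placeholders:
$ psql -d mydb -c "COPY mytable TO STDOUT" | gzip > /client/path/mytable.dump.gz
With that approach the data travels uncompressed over the connection and
the compression work happens on the client rather than the server.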
Supporting COPY to a pipe would be interesting, though the security
implications would need plenty of thought.
--
Craig Ringer