From: Dmitry Tkach <dmitry(at)openratings(dot)com>
To: jerome <jerome(at)gmanmi(dot)tv>
Subject: Re: URGENT: pg_dump error
Date: 2003-02-11 18:59:56
Message-ID: 3E49482C.6050002@openratings.com
Lists: pgsql-general
I suspect your problem is that the output file is too large (on ext2, a single file cannot exceed 2 GB).
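One way to confirm this kind of limit (a generic shell sketch, not from the original mail) is to check the shell's per-process file-size limit; a process that writes past it receives SIGXFSZ and dies, which can look like the "KILLED" symptom described below:

```shell
# Print the per-process file size limit: "unlimited", or a number of
# 512-byte blocks. Exceeding it terminates the writing process.
ulimit -f
```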
Try this:
pg_dump mydatabase -t mytable | gzip -f > sample.gz
or
pg_dump mydatabase -t mytable | split -C 2000m - sample.
or even
pg_dump mydatabase -t mytable | gzip -f | split -b 2000m - sample.gz.
...
The first case should work unless even the compressed file is larger than 2 GB; either of the other two will work regardless of the output size
(as long as it fits on your disk, of course).
In the last two cases, split will create several files, named sample.aa, sample.ab, ... or sample.gz.aa, sample.gz.ab, etc.
To 'reassemble' them later, you'll need something like:
cat sample.* | psql mydatabase          # for the split-only case (no gzip)
or
cat sample.gz.* | gunzip -f | psql mydatabase   # for the split+gzip case
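As a sanity check of the split-and-reassemble approach, here is a self-contained sketch on dummy data (the file names are illustrative, standing in for a real pg_dump stream):

```shell
# Stand-in data in place of actual pg_dump output.
seq 1 100000 > original.txt

# Compress and split into 64 KB pieces: sample.gz.aa, sample.gz.ab, ...
gzip -c original.txt | split -b 64k - sample.gz.

# Reassemble: concatenate the pieces in name order, then decompress.
cat sample.gz.* | gunzip -c > restored.txt

# Verify the round trip is lossless.
cmp original.txt restored.txt && echo "round trip OK"
```

The same pattern works with psql on the receiving end of the pipe, exactly as in the commands above.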
I hope it helps...
Dima
jerome wrote:
> i tried to do pg_dump
>
> pg_dump mydatabase -t mytable > sample
>
> it always results to
>
> KILLED
>
> can anyone tell me what should i do...
>
> TIA
>