| From: | Bill Brandt <brandtwr-pgsql(at)draaw(dot)net> |
|---|---|
| To: | pgsql-sql(at)hub(dot)org |
| Subject: | question about pg_dump |
| Date: | 1999-03-03 03:01:56 |
| Message-ID: | 19990302220156.A30158@draaw.net |
| Lists: | pgsql-sql |
I'm trying to upgrade from 6.3. In doing this, I performed a pg_dump of
each database (pg_dump -o where the OIDs matter). I have one database
(which doesn't use OID info in its tables, btw) that is around 40MB of
data. The output file it created looks fine, but when I go to load the
data, it gets part of the way through and then reports that you cannot
exceed 20000 lines of input on a copy command. Is there a way to dump the
database so that it won't try more than 20000 lines at a time? I kept the
6.3 directory, so dumping again is not a big deal.
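
One workaround that occurred to me (untested, and I'm not sure how it
interacts with -o) is to have pg_dump write the data as individual INSERT
statements instead of one big COPY block, roughly:

    # untested sketch; "mydb" and "newdb" are just placeholders for the
    # real database names
    # -d should make pg_dump emit INSERT statements rather than COPY data,
    # so no single COPY command has to carry more than 20000 lines
    pg_dump -d mydb > mydb.dump
    psql newdb < mydb.dump

I expect the INSERTs would load much more slowly than COPY on 40MB of
data, so I'd rather keep using COPY if there's a way around the limit.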
Bill
--
Bill Brandt
brandtwr(at)draaw(dot)net http://www.draaw.net/