From: Tom Lane <tgl(at)sss(dot)pgh(dot)pa(dot)us>
To: pgsql-sql(at)hub(dot)org
Subject: Re: question about pg_dump
Date: 1999-03-03 15:25:31
Message-ID: 25093.920474731@sss.pgh.pa.us
Lists: pgsql-sql
Bill Brandt <brandtwr-pgsql(at)draaw(dot)net> writes:
> I'm trying to upgrade from 6.3. In doing this, I performed a pg_dump of
> each database (pg_dump -o where the oid's matter). I have one database
> (which doesn't use oid info in the tables btw) that is around 40MB of
> data. The output file that it created looks fine, but when I go to load
> the data, it gets a portion of the way into loading and indicates that
> you cannot exceed 20000 lines of input on a copy command. Is there a way
> to dump the database so that it won't try more than 20000 lines at a time?
You can just split the command into sections by editing the dump file
(assuming you have an editor that will cope ;-)):
COPY table FROM stdin;
blah blah
blah blah
\.
COPY table FROM stdin;
more more
more more
\.
COPY is really APPEND; it doesn't delete any tuples that were already
there, so this should work.
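For a dump too large to edit by hand, a small script along these lines could make the same change automatically. This is a hypothetical sketch, not something from this thread: the function name and chunk size are assumptions, and it only handles the plain-text COPY sections of a pg_dump output file.

```python
# Hypothetical sketch: split each COPY data section of a pg_dump text
# file into chunks of at most max_lines rows, repeating the
# "COPY ... FROM stdin;" header and the terminating \. for each chunk.
# Safe because COPY appends: re-running the header loses no rows.

MAX_LINES = 20000  # assumed limit from the reported error

def split_copy_blocks(lines, max_lines=MAX_LINES):
    """Return output lines with every COPY data section capped at max_lines rows."""
    out = []
    header = None   # the current COPY command, while inside a data section
    count = 0       # data rows emitted in the current chunk
    for line in lines:
        if header is None:
            # Outside a COPY data section: watch for the start of one.
            if line.startswith("COPY ") and line.rstrip().endswith("FROM stdin;"):
                header = line
                count = 0
            out.append(line)
        elif line.rstrip("\n") == "\\.":
            # End of this COPY's data.
            out.append(line)
            header = None
        else:
            if count == max_lines:
                # Close the current chunk and start a fresh COPY.
                out.append("\\.\n")
                out.append(header)
                count = 0
            out.append(line)
            count += 1
    return out
```

Run over the dump file and write the result out, then feed the rewritten file to psql as usual.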
Still, a 20K-line limit sounds awfully brain-dead to me. And a quick
look at the source code (both 6.4.2 and current CVS) doesn't show any
such check unless I missed something. What version are you using?
What's the exact text of the error message?
regards, tom lane