| From: | Bruno Wolff III <bruno(at)wolff(dot)to> |
|---|---|
| To: | Bricklen <bricklen-rem(at)yahoo(dot)comz> |
| Cc: | pgsql-general(at)postgresql(dot)org |
| Subject: | Re: COPY error handling |
| Date: | 2004-06-07 04:11:36 |
| Message-ID: | 20040607041136.GB17952@wolff.to |
| Lists: | pgsql-general |
On Fri, Jun 04, 2004 at 14:11:19 +0000,
Bricklen <bricklen-rem(at)yahoo(dot)comz> wrote:
> Hi, I'm not sure if this is the correct group for this question, but
> I'll post it hoping that it is.
> I'm loading several ~15 million row files into a table using the COPY
> command. Apparently one of the rows, about 6 million in, has an invalid
> entry. This is causing the COPY command to fail, so my question is this:
> Is there any way to skip invalid rows? Or send them to a separate log
> file etc to go through later?
> I've gone through the docs, but I didn't see anything specific to this.
> Any information, links, or hints are greatly appreciated.
Currently there isn't a built-in way to do this. You can pass the data
through a filter script that removes rows that are not in the proper format.
| From | Date | Subject | |
|---|---|---|---|
| Next Message | Bruno Wolff III | 2004-06-07 04:17:58 | Re: Updating a unique constrant |
| Previous Message | felix-lists | 2004-06-07 01:12:13 | Re: Dropping schemas and "illegal seek" -- MEA CUPLA |