From:       Jasen Betts <jasen(at)xnet(dot)co(dot)nz>
To:         pgsql-novice(at)postgresql(dot)org
Subject:    Re: how to continue after error in batch mode with psql
Date:       2010-04-29 09:18:58
Message-ID: hrbiu2$tt4$2@reversiblemaps.ath.cx
Lists:      pgsql-novice
On 2010-04-28, Urs Rau <urs(dot)rau(at)uk(dot)om(dot)org> wrote:
> I have a nightly process that pulls data over into PostgreSQL 8.3.10
> from a Progress server running under 8.x. Unfortunately, the Progress
> DB does not enforce field length limits: it does not care if a
> 20-character field contains 21 characters, or 100 for that matter.
>
> We have a simple pipe that takes the Progress data export dump and
> imports it into PostgreSQL:
>
> su - postgres -c "cat test.csv | psql -X -q test -c \"COPY t_test FROM
> stdin WITH DELIMITER AS ',' NULL AS '?' CSV QUOTE AS '\\\"' ESCAPE AS
> '\\\"'\""
>
> If any of the fields are over-length, we get an error message that
> tells us which row needs fixing:
>
> ERROR: value too long for type character varying(20)
> CONTEXT: COPY t_test, line 2, column t_comment_c: "'comment 3 that is
> too long'"
>
> But how do we get psql to continue past an error, import as much of
> the rest of the data as possible, and report every line that has an
> error?
Why not just copy the whole lot into a temp table first (like the
target table, but with text columns instead of length-limited varchar),
then run one select on it to find the over-length rows and another to
insert the good data into the real target table?
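
A minimal sketch, assuming a staging table named t_test_raw and the
20-character t_comment_c column from your example (names and the limit
are illustrative; adjust them to the real schema):

  -- staging table: same columns as t_test, with the length-limited
  -- column relaxed to text so COPY never rejects a row
  CREATE TABLE t_test_raw (LIKE t_test);
  ALTER TABLE t_test_raw ALTER COLUMN t_comment_c TYPE text;

  -- load via your existing pipe, pointed at t_test_raw instead of
  -- t_test, e.g.:
  --   COPY t_test_raw FROM stdin WITH DELIMITER AS ',' NULL AS '?'
  --        CSV QUOTE AS '"' ESCAPE AS '"';

  -- report every offending row in one pass
  SELECT * FROM t_test_raw WHERE length(t_comment_c) > 20;

  -- move only the rows that fit; coalesce() keeps NULL comments,
  -- which a bare length() comparison would silently drop
  INSERT INTO t_test
  SELECT * FROM t_test_raw
  WHERE coalesce(length(t_comment_c), 0) <= 20;

  DROP TABLE t_test_raw;

That way the COPY itself can no longer fail on length, you get all the
bad rows in one report instead of one per run, and everything that fits
still lands in t_test.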