From: | "Dann Corbit" <DCorbit(at)connx(dot)com> |
---|---|
To: | <pgsql-general(at)postgresql(dot)org> |
Subject: | Re: mass import to table with unique index |
Date: | 2003-01-30 07:16:25 |
Message-ID: | D90A5A6C612A39408103E6ECDD77B8294CD8A3@voyager.corporate.connx.com |
Lists: pgsql-general
> -----Original Message-----
> From: Shridhar Daithankar
> [mailto:shridhar_daithankar(at)persistent(dot)co(dot)in]
> Sent: Wednesday, January 29, 2003 11:13 PM
> To: pgsql-general(at)postgresql(dot)org
> Subject: Re: [GENERAL] mass import to table with unique index
>
>
> On 29 Jan 2003 at 15:00, John Smith wrote:
>
> >
> > Is there a way to mass import (like COPY, INSERT INTO ... SELECT ...)
> > data into an existing table with existing data that has a unique index?
> > Such as importing data with SSNs, and there's a unique index on the
> > SSN column. MySQL has an 'IGNORE' option for mass imports. Any way
> > with PostgreSQL? Or only with an INSERT command for each record?
>
> I don't understand. Why wouldn't copy work in this case? It does
> insert only and it does check the index, if I am not making a mistake.
>
> I am not sure you want the constraint in place while it is mass
> importing. You can always drop the index, mass import the data and
> recreate the index if you are sure what you are doing..
I think what the OP is looking for is the SQL Server equivalent of the
IGNORE_DUP_KEY option, where if you try to insert a record whose key is
already present, the server simply ignores that record. Hence if you
load a batch of 100 identical records, a single record gets inserted.
It's useful for things like building dictionaries from a large list of
words.
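
PostgreSQL has no direct equivalent that I know of, but you can get
much the same effect by loading into an unconstrained scratch table
first and then inserting only the keys that aren't already present.
A rough sketch, with table, column, and file names invented for
illustration:

    -- Stage the raw data in a temp table with no unique index,
    -- so COPY cannot fail on duplicates.
    CREATE TEMP TABLE staging (ssn text, name text);
    COPY staging FROM '/tmp/people.dat';

    -- Move over one row per new SSN: DISTINCT ON collapses
    -- duplicates within the batch, and NOT EXISTS skips rows
    -- that collide with data already in the target table.
    INSERT INTO people (ssn, name)
    SELECT DISTINCT ON (ssn) ssn, name
    FROM staging
    WHERE NOT EXISTS
          (SELECT 1 FROM people WHERE people.ssn = staging.ssn);

Note that DISTINCT ON is a PostgreSQL extension, and without an
ORDER BY it keeps an arbitrary row from each duplicate group.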