Re: prevent duplicate entries

From: David G Johnston <david(dot)g(dot)johnston(at)gmail(dot)com>
To: pgsql-novice(at)postgresql(dot)org
Subject: Re: prevent duplicate entries
Date: 2014-05-29 14:03:39
Message-ID: 1401372219275-5805419.post@n5.nabble.com
Lists: pgsql-novice

amulsul wrote
> On Thursday, 29 May 2014 3:20 PM, Thomas Drebert <drebert@> wrote:
>
>> Has postgresql a separate function to prevent duplicate records?
>> At time i filter records in php.
>
> you can directly load csv file date on postgres database using
> pg_bulkload, which has functionality to avoid duplication
>
> pg_bulkload : http://pgbulkload.projects.pgfoundry.org/pg_bulkload.html
>
> Is this answer to your question?
>
> Regards,
> Amul Sul

You might find it better to just load the CSV data into a staging table, then
perform the necessary "INSERT INTO live ... SELECT ... FROM staging" query to
migrate only the new data.
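A minimal sketch of that approach, assuming a hypothetical "live" table keyed
on an "id" column and a CSV file at /path/to/data.csv (table and file names
are placeholders):

```sql
-- Load the CSV into a throwaway staging table with the same shape as live.
CREATE TEMP TABLE staging (LIKE live);
COPY staging FROM '/path/to/data.csv' WITH (FORMAT csv, HEADER true);

-- Copy over only rows whose key is not already present in live;
-- DISTINCT ON also collapses duplicates within the CSV itself.
INSERT INTO live
SELECT DISTINCT ON (s.id) s.*
FROM staging s
WHERE NOT EXISTS (
    SELECT 1 FROM live l WHERE l.id = s.id
);

DROP TABLE staging;
```

Since the staging table carries no unique constraint, the load itself never
errors out, and the single INSERT ... SELECT decides what actually moves over.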

If a large share (say 90%) of your data is duplicates, it likely makes little
sense to burn resources generating duplicate-key errors row by row.

David J.

