Re: Finding Duplicate Rows during INSERTs

From: Darren Duncan <darren(at)darrenduncan(dot)net>
To: Rich Shepard <rshepard(at)appl-ecosys(dot)com>
Cc: pgsql-general(at)postgresql(dot)org
Subject: Re: Finding Duplicate Rows during INSERTs
Date: 2012-07-10 02:54:23
Message-ID: 4FFB995F.5080007@darrenduncan.net
Lists: pgsql-general

Rich Shepard wrote:
> Source data has duplicates. I have a file that creates the table then
> INSERTS INTO the table all the rows. When I see errors flash by during the
> 'psql -d <database> -f <file.sql>' I try to scroll back in the terminal to
> see where the duplicate rows are located. Too often they are too far
> back to
> let me scroll to see them.
>
> There must be a better way of doing this. Can I run psql with the tee
> command to capture errors in a file I can examine? What is the proper/most
> efficient way to identify the duplicates so they can be removed?
>
> TIA,
>
> Rich

What I recommend instead is inserting your data into staging tables that lack
key constraints; you can then use SQL either to locate the duplicates or to
simply copy the unique rows over to the normal tables. SQL is usually a better
tool than anything else for cleaning (or reporting on) data.
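For example, a minimal sketch of that approach, assuming a hypothetical target
table "observations" whose logical key is (site_id, sample_date) -- adjust the
names to your schema:

  -- Staging table with the same columns but no key constraints,
  -- so duplicate rows load without raising errors.
  CREATE TABLE observations_staging (LIKE observations INCLUDING DEFAULTS);

  -- Load the raw file into the staging table instead of the real one,
  -- then see which key values occur more than once.
  SELECT site_id, sample_date, count(*)
  FROM observations_staging
  GROUP BY site_id, sample_date
  HAVING count(*) > 1;

  -- Copy just one row per key into the real table.
  INSERT INTO observations
  SELECT DISTINCT ON (site_id, sample_date) *
  FROM observations_staging
  ORDER BY site_id, sample_date;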
-- Darren Duncan
