Re: Finding Duplicate Rows during INSERTs

From: Rob Sargent <robjsargent(at)gmail(dot)com>
To: pgsql-general(at)postgresql(dot)org
Subject: Re: Finding Duplicate Rows during INSERTs
Date: 2012-07-10 00:01:06
Message-ID: 4FFB70C2.9070207@gmail.com
Lists: pgsql-general

On 07/09/2012 04:48 PM, Rich Shepard wrote:
> Source data has duplicates. I have a file that creates the table then
> INSERTS INTO the table all the rows. When I see errors flash by during the
> 'psql -d <database> -f <file.sql>' I try to scroll back in the terminal to
> see where the duplicate rows are located. Too often they are too far back
> to let me scroll to see them.
>
> There must be a better way of doing this. Can I run psql with the tee
> command to capture errors in a file I can examine? What is the proper/most
> efficient way to identify the duplicates so they can be removed?
>
> TIA,
>
> Rich
>
>

psql -d <database> -f file.sql > file.log 2>&1 would give you a log file with both the normal output and the errors, which you can search after the run
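
If you would rather keep just the errors, or stop at the first failed
INSERT so the offending row is easy to find, something along these lines
should work (file names are only placeholders):

    # capture only stderr, i.e. the error messages, in their own file
    psql -d <database> -f file.sql 2> errors.log

    # or halt at the first error instead of scrolling back for it
    psql -v ON_ERROR_STOP=1 -d <database> -f file.sql > file.log 2>&1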

sort -u file.raw > file.uniq might give you clean data?
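
If you want to see which rows are duplicated rather than just drop them,
uniq can list them, assuming file.raw has one row per line and the
duplicates are exact whole-line matches. You could also load everything
into a staging table first and ask the database; the table and column
names below are made up for illustration:

    # print each duplicated line once
    sort file.raw | uniq -d > dupes.txt

    # or show how many times each duplicate occurs, most frequent first
    sort file.raw | uniq -cd | sort -rn

    # duplicate check inside the database, on a hypothetical staging table
    psql -d <database> -c "SELECT col1, col2, count(*) FROM staging GROUP BY col1, col2 HAVING count(*) > 1;"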
