From: papapep <papapep(at)gmx(dot)net>
To: pgsql-novice <pgsql-novice(at)postgresql(dot)org>
Subject: Filtering duplicated row with a trigger
Date: 2003-10-06 16:30:29
Message-ID: 3F8198A5.9040905@gmx.net
Lists: pgsql-novice
I've got plenty of data files (prepared to be loaded with the \copy
command), but I have to filter them to make sure no duplicate rows get
inserted.
I know I should do it with a trigger that executes a function before
inserting each row and, if the row is a duplicate, does something with it
(inserts it into another table, simply discards it, etc.). The theory is
clear :-)
But the practice is not so clear (for me, of course).
Can anyone give me some guidance on how the function should check for
duplicate rows? I've tried to sketch below what I have in mind.
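For concreteness, here is a rough sketch of the kind of thing I mean (all
table and column names are made up, and I am not at all sure this is the
right approach):

-- All names are invented for the example: "mydata" is the target
-- table, "code" and "val" its columns, and "mydata_dups" a side
-- table that collects the rejected rows.
CREATE TABLE mydata_dups AS SELECT * FROM mydata WHERE false;

CREATE OR REPLACE FUNCTION skip_duplicates() RETURNS trigger AS '
BEGIN
    -- Look for a row that already has the same key.
    PERFORM 1 FROM mydata WHERE code = NEW.code;
    IF FOUND THEN
        -- Keep a copy of the rejected row, then skip the INSERT:
        -- returning NULL from a BEFORE ROW trigger drops the row.
        INSERT INTO mydata_dups VALUES (NEW.code, NEW.val);
        RETURN NULL;
    END IF;
    RETURN NEW;
END;
' LANGUAGE plpgsql;

CREATE TRIGGER mydata_skip_dups
    BEFORE INSERT ON mydata
    FOR EACH ROW EXECUTE PROCEDURE skip_duplicates();

Is returning NULL from the BEFORE trigger really enough to make rows loaded
with \copy be silently skipped, or does \copy need something else?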
Thanks.
Josep Sànchez
[papapep]