I have three tables: one core table and two others that depend on it. I need
to insert up to 70,000 records into these tables, which have constraints
(primary and foreign keys, indexes, unique constraints, etc.) defined on them.
I can't use bulk import (the COPY command) because the requirement does not
provide a standard .csv file; the column mapping has to be done explicitly,
and a few validations are applied externally in a C program.
Each record (up to 70,000 in total) is passed from a .pgc file (an ECPG-based
C program) to PostgreSQL. The first few records insert quickly, but
performance degrades badly for the later ones: at this rate it takes days to
get through even 20,000! What performance measures could I take here? Please
guide me.
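For reference, the insert loop in my .pgc file looks roughly like the sketch below. The table and variable names are illustrative, not my actual schema; the point is that each row is inserted with its own statement and, I suspect, its own implicit transaction:

```
/* Illustrative ECPG sketch only -- "core_table", "rec_id", and "rec_name"
   are made-up names, not my real schema. Requires the ecpg preprocessor
   and a live PostgreSQL database to actually build and run. */
EXEC SQL BEGIN DECLARE SECTION;
    int  rec_id;
    char rec_name[64];
EXEC SQL END DECLARE SECTION;

void insert_records(const Record *recs, int n)
{
    int i;
    for (i = 0; i < n; i++) {
        rec_id = recs[i].id;
        strncpy(rec_name, recs[i].name, sizeof(rec_name) - 1);
        rec_name[sizeof(rec_name) - 1] = '\0';

        /* One INSERT per record. With no explicit BEGIN, each statement
           effectively commits on its own, which I suspect is part of the
           slowdown as the row count grows. */
        EXEC SQL INSERT INTO core_table (id, name)
                 VALUES (:rec_id, :rec_name);
        EXEC SQL COMMIT;
    }
}
```

Is the per-row commit pattern shown here the likely culprit, or should I be looking at the constraint and index checks instead?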