siva palanisamy <psivait(at)gmail(dot)com> wrote:
> I basically have 3 tables: one core table and two others that
> depend on it. I need to add up to 70,000 records to these tables.
> I have constraints (primary and foreign keys, indexes, unique
> constraints, etc.) defined on the tables. I can't use bulk import
> (the COPY command) because there is no standard .csv file
> involved; explicit mapping is required, and a few validations are
> applied externally in a C-based program. Each record's details
> (up to 70,000 records) are passed from a .pgc file (an ECPG-based
> C program) to PostgreSQL. The first few records insert quickly,
> but performance degrades badly for the later ones! The result is
> that it takes days just to get through 20,000! What performance
> measures could I take here? Please guide me.
There's an awful lot you're not telling us, like what version of
PostgreSQL you're using, what your hardware looks like, how many
rows you're trying to insert per database transaction, what
resource usage looks like on the machine when it's running slow,
what the specific
slow queries are and what their execution plans look like, etc. I
could make a lot of guesses and take a shot in the dark with some
generic advice, but you would be better served by the more specific
advice you will get if you provide more detail.
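One generic guess, purely to illustrate the "rows per transaction"
point: if your .pgc loop is committing after every single row,
grouping many inserts into one transaction usually helps a great
deal. A minimal ECPG sketch of that idea (the table, columns,
connection target, and batch size below are all made up, since you
haven't posted your schema):

#include <stdio.h>

EXEC SQL INCLUDE sqlca;

int main(void)
{
    EXEC SQL BEGIN DECLARE SECTION;
    int  i;
    char name[64];
    EXEC SQL END DECLARE SECTION;

    EXEC SQL CONNECT TO mydb;
    EXEC SQL WHENEVER SQLERROR SQLPRINT;

    for (i = 0; i < 70000; i++)
    {
        snprintf(name, sizeof(name), "row_%d", i);

        /* inserts into the two dependent tables would go here too */
        EXEC SQL INSERT INTO core_tab (id, name) VALUES (:i, :name);

        /* commit every 1000 rows instead of after each row;
           1000 is just an example batch size */
        if ((i + 1) % 1000 == 0)
        {
            EXEC SQL COMMIT;
        }
    }
    EXEC SQL COMMIT;        /* commit the final partial batch */

    EXEC SQL DISCONNECT ALL;
    return 0;
}

That is only one guess, though; index and foreign key maintenance,
or something in the external validation step, could just as easily
be the bottleneck, which is why the details above matter.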
Please review this page (and its links) and post again:
http://wiki.postgresql.org/wiki/SlowQueryQuestions
-Kevin