Re: High memory usage / performance issue ( temp tables ? )

From: gmb <gmbouwer(at)gmail(dot)com>
To: pgsql-sql(at)postgresql(dot)org
Subject: Re: High memory usage / performance issue ( temp tables ? )
Date: 2014-08-17 10:40:21
Message-ID: 1408272021571-5815111.post@n5.nabble.com
Lists: pgsql-sql

>> Are you using the same temp tables for the whole batch or do you
>> generate a few 100K of them ?

The process re-creates the 10 temp tables for each call of the function,
i.e. this equates to roughly 500k temp tables for the 50k xml files.
The "ON COMMIT DROP" clause was added at some stage as an attempt to solve
some performance issues. The argument was that, since a COMMIT is done
after each of the 50k xml files, the number of temp tables would not build
up and cause any problems.
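A minimal sketch of the pattern described above (table, column and function
names here are placeholders, not the actual code):

```sql
-- Hypothetical per-file import function; each call re-creates its temp tables.
CREATE OR REPLACE FUNCTION import_xml_file(p_xml xml) RETURNS void AS $$
BEGIN
  -- One of the ~10 temp tables; dropped automatically at COMMIT,
  -- so 50k files * 10 tables => ~500k create/drop cycles against
  -- pg_class / pg_attribute etc.
  CREATE TEMP TABLE stage_lines (
    line_no  int,
    payload  text
  ) ON COMMIT DROP;
  -- ... populate the staging tables from p_xml and process them here ...
END;
$$ LANGUAGE plpgsql;
```

The driver then calls this once per file and commits after each call, which
keeps the temp tables from accumulating but still churns the catalogs for
every file.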

I can understand the performance issue due to the load on the catalogs, but I
would not have expected it to have the impact I'm experiencing.

>> It may help to call analyze explicitly on the touched tables
>> a few times during your process. Here a look at the monitoring statistics
>> may give some clue.
>> (http://blog.pgaddict.com/posts/the-two-kinds-of-stats-in-postgresql)

Thanks, I'll try this and see if it makes any difference.
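For what it's worth, the explicit-analyze suggestion could look something
like this in the batch loop (the table names are placeholders):

```sql
-- After every N files, refresh planner statistics on the permanent
-- tables the import writes to, so plans don't degrade mid-batch.
ANALYZE invoices;
ANALYZE invoice_lines;

-- And the monitoring statistics mentioned in the linked post:
SELECT relname, n_live_tup, n_dead_tup, last_analyze, last_autoanalyze
  FROM pg_stat_user_tables
 WHERE relname IN ('invoices', 'invoice_lines');
```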

Thanks for the input.

Regards

gmb

