From: Mridula Mahadevan <mmahadevan(at)stratify(dot)com>
To: Tom Lane <tgl(at)sss(dot)pgh(dot)pa(dot)us>
Cc: "pgsql-performance(at)postgresql(dot)org" <pgsql-performance(at)postgresql(dot)org>
Subject: Re: Queries within a function
Date: 2010-02-02 20:53:56
Message-ID: 0A59BA5B590B7E4A8D441196A9F17E904C68825D30@corpmail11.calpurnia.com
Lists: pgsql-performance
Tom,
I cannot use EXECUTE because these are all temp tables created with ON COMMIT DROP inside a function. This must have something to do with running inside a transaction: when I run the queries in autocommit mode (without ON COMMIT DROP), they return in a few seconds.
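[Editor's note: a minimal sketch of the pattern described above; the table and column names are hypothetical. The function creates ON COMMIT DROP temp tables and then queries them with static SQL, which plpgsql plans once and caches across calls.]

```sql
-- Hypothetical sketch of the setup described in this thread.
CREATE OR REPLACE FUNCTION process_batch() RETURNS void AS $$
BEGIN
  -- Temp table vanishes at transaction end, as described above.
  CREATE TEMP TABLE tmp_stage (id int, payload text) ON COMMIT DROP;

  -- Static SQL inside plpgsql: the plan for these statements is cached
  -- the first time the function runs, and may become a poor fit once
  -- the data volume grows past ~100k rows.
  INSERT INTO tmp_stage SELECT id, payload FROM source_data;

  UPDATE target t
     SET payload = s.payload
    FROM tmp_stage s
   WHERE t.id = s.id;
END;
$$ LANGUAGE plpgsql;
```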
-----Original Message-----
From: Tom Lane [mailto:tgl(at)sss(dot)pgh(dot)pa(dot)us]
Sent: Tuesday, February 02, 2010 11:28 AM
To: Mridula Mahadevan
Cc: pgsql-performance(at)postgresql(dot)org
Subject: Re: [PERFORM] Queries within a function
Mridula Mahadevan <mmahadevan(at)stratify(dot)com> writes:
> I am running a bunch of queries within a function, creating some temp tables and populating them. When the data exceeds, say, 100k rows the queries start getting really slow and time out (30 min). When these are run outside of a transaction (in autocommit mode), they run in a few seconds. Any ideas on what may be going on, and any postgresql.conf parameters etc. that might help?
I'll bet the function is caching query plans that stop being appropriate
once the table grows in size. You might have to resort to using
EXECUTE, although if you're on 8.4 DISCARD PLANS ought to help too.
regards, tom lane
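[Editor's note: a sketch of the two workarounds Tom suggests, using the same hypothetical table names as above. EXECUTE forces plpgsql to plan the statement at run time instead of reusing a cached plan; note that, contrary to the concern upthread, EXECUTE works fine against ON COMMIT DROP temp tables as long as they exist when the statement runs.]

```sql
CREATE OR REPLACE FUNCTION process_batch() RETURNS void AS $$
BEGIN
  CREATE TEMP TABLE tmp_stage (id int, payload text) ON COMMIT DROP;

  -- EXECUTE re-plans on every call, so the planner sees the
  -- temp table's current size rather than a stale cached plan.
  EXECUTE 'INSERT INTO tmp_stage SELECT id, payload FROM source_data';

  EXECUTE 'UPDATE target t
              SET payload = s.payload
             FROM tmp_stage s
            WHERE t.id = s.id';
END;
$$ LANGUAGE plpgsql;

-- Alternatively, per Tom's note, on 8.4 the session's cached plans
-- can be flushed between calls:
DISCARD PLANS;
```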