From: david(at)lang(dot)hm
To: Stephen Frost <sfrost(at)snowman(dot)net>
Cc: James Mansion <james(at)mansionfamily(dot)plus(dot)com>, pgsql-performance(at)postgresql(dot)org
Subject: Re: performance for high-volume log insertion
Date: 2009-04-22 00:12:26
Message-ID: alpine.DEB.1.10.0904211709030.12662@asgard.lang.hm
Lists: pgsql-performance
On Tue, 21 Apr 2009, Stephen Frost wrote:
> * James Mansion (james(at)mansionfamily(dot)plus(dot)com) wrote:
>> david(at)lang(dot)hm wrote:
>>> on the other hand, when you have a full queue (lots of stuff to
>>> insert) is when you need the performance the most. if it's enough of a
>>> win on the database side, it could be worth more effort on the
>>> application side.
>> Are you sure preparing a simple insert is really worthwhile?
>>
>> I'd check if I were you. It shouldn't take long to plan.
>
> Using prepared queries, at least if you use PQexecPrepared or
> PQexecParams, also reduces the work required on the client to build the
> whole string, and the parsing overhead on the database side to pull it
> apart again. That's where the performance is going to be improved by
> going that route, not so much in eliminating the planning.
In a recent thread about prepared statements, it was identified that since planning takes place at PREPARE time, a prepared statement can sometimes get a worse plan than the equivalent non-prepared statement, and a proposal was made for a 'pre-parsed, but not pre-planned' version of a prepared statement. This was dismissed as a waste of time (IIRC by Tom L) on the grounds that the parsing time was negligible.

Was that conclusion just because the query in question was complex to plan, making the parsing cost relatively small?
David Lang
Next message: Robert Haas | 2009-04-22 02:29:16 | Re: performance for high-volume log insertion
Previous message: Greg Smith | 2009-04-22 00:01:10 | Re: performance for high-volume log insertion