Re: psql vs perl prepared inserts

From: "Daniel Verite" <daniel(at)manitou-mail(dot)org>
To: pgsql-general(at)postgresql(dot)org
Subject: Re: psql vs perl prepared inserts
Date: 2005-04-13 14:52:42
Message-ID: 20050413165239.6368792@uruguay
Lists: pgsql-general

Neil Conway wrote:

> For an INSERT query without any sub-selects
> that is not rewritten by any rules, the cost to parse, rewrite and plan
> the statement is trivial. So I wouldn't expect prepared statements to be
> a big win -- you would gain a lot more from batching multiple inserts
> into a single transaction, and more still from using COPY.
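
A minimal sketch of the batching Neil describes, using libpq and a hypothetical
five-column table t(a,b,c,d,e), is just a BEGIN/COMMIT around the insert loop:

#include <libpq-fe.h>
#include <cstdio>

/* Run 'rows' INSERTs inside a single transaction: one commit for
   the whole batch instead of one per statement. */
static void batched_inserts(PGconn *conn, int rows)
{
    PQclear(PQexec(conn, "BEGIN"));
    for (int i = 0; i < rows; i++) {
        char sql[128];
        std::snprintf(sql, sizeof sql,
            "INSERT INTO t(a,b,c,d,e) VALUES (%d,%d,%d,%d,%d)",
            i, i, i, i, i);
        PQclear(PQexec(conn, sql));
    }
    PQclear(PQexec(conn, "COMMIT"));
}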

FWIW, when testing pgstream [1] I typically see about a 50% increase in
execution speed when switching to prepared statements in such a scenario.
I'm attaching a small test program that inserts 10000 rows into a 5-column
table, first without and then with prepared statements, and displays the
elapsed time for each loop.

Example of results:

elapsed time in loop 0 is 1873 ms (PQexec)
elapsed time in loop 1 is 1136 ms (PQexecPrepared)

That's with Unix-domain sockets and an 8.0.1 server.

[1] pgstream is a thin C++ layer on top of libpq (http://manitou-mail.org/pgstream)
that happens to provide a unified API for prepared and non-prepared statements.
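
The gist of loop 0 vs loop 1, written directly against libpq rather than
pgstream (a sketch only; the table and column names below are invented, not
taken from the attached program):

#include <libpq-fe.h>
#include <cstdio>
#include <sys/time.h>

/* Wall-clock milliseconds. */
static double ms_now(void)
{
    struct timeval tv;
    gettimeofday(&tv, NULL);
    return tv.tv_sec * 1000.0 + tv.tv_usec / 1000.0;
}

/* Loop 0: one PQexec per row, so the query is parsed and planned each time. */
static double loop_pqexec(PGconn *conn, int rows)
{
    double t0 = ms_now();
    for (int i = 0; i < rows; i++) {
        char sql[160];
        std::snprintf(sql, sizeof sql,
            "INSERT INTO speedtest(c1,c2,c3,c4,c5) VALUES (%d,%d,%d,%d,%d)",
            i, i, i, i, i);
        PQclear(PQexec(conn, sql));
    }
    return ms_now() - t0;
}

/* Loop 1: prepare once, then PQexecPrepared per row with text parameters. */
static double loop_prepared(PGconn *conn, int rows)
{
    double t0 = ms_now();
    PQclear(PQprepare(conn, "ins",
        "INSERT INTO speedtest(c1,c2,c3,c4,c5) VALUES ($1,$2,$3,$4,$5)",
        5, NULL));
    for (int i = 0; i < rows; i++) {
        char v[32];
        std::snprintf(v, sizeof v, "%d", i);
        const char *values[5] = { v, v, v, v, v };
        PQclear(PQexecPrepared(conn, "ins", 5, values, NULL, NULL, 0));
    }
    return ms_now() - t0;
}

int main()
{
    PGconn *conn = PQconnectdb("");   /* connection parameters from PG* env */
    if (PQstatus(conn) != CONNECTION_OK) {
        std::fprintf(stderr, "connection failed: %s", PQerrorMessage(conn));
        return 1;
    }
    PQclear(PQexec(conn, "CREATE TEMP TABLE speedtest"
                         "(c1 int, c2 int, c3 int, c4 int, c5 int)"));
    std::printf("elapsed time in loop 0 is %.0f ms (PQexec)\n",
                loop_pqexec(conn, 10000));
    std::printf("elapsed time in loop 1 is %.0f ms (PQexecPrepared)\n",
                loop_prepared(conn, 10000));
    PQfinish(conn);
    return 0;
}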

--
Daniel
PostgreSQL-powered mail user agent and storage: http://www.manitou-mail.org

Attachment: tstspeed.cpp (text/plain, 1.1 KB)
