From: Sam Mason <sam(at)samason(dot)me(dot)uk>
To: pgsql-general(at)postgresql(dot)org
Subject: Re: Performance of update
Date: 2008-03-27 11:11:48
Message-ID: 20080327111148.GS6870@frubble.xen.chris-lamb.co.uk
Lists: pgsql-general
On Wed, Mar 26, 2008 at 01:26:03PM -0700, Sam wrote:
> I am trying to update a database table with approx 45000 rows. I am
> not updating all rows at a time; I am updating 60 rows at a time, for
> example, and this is happening in a FOR LOOP. A function that has the
> update statements is called within the loop.
>
> The updates take too long... is Postgres slow in doing updates on
> large tables, or is it because of the function call within the loop?
The short answer is that if you can rearrange your code to do fewer
updates that each do more work, things will probably be quicker.
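For example, if each pass of your loop updates the rows matching one
key, the whole loop can often collapse into a single set-based
statement. A minimal sketch, assuming a hypothetical staging table
new_vals(id, val) holding the values you were looping over (the table
and column names are placeholders, not from your schema):

  -- one statement touches all 45000 rows at once, instead of
  -- ~750 separate calls updating 60 rows each
  UPDATE mytable AS t
  SET    val = n.val
  FROM   new_vals AS n
  WHERE  t.id = n.id;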
Each round trip to the database is going to take a fixed amount of time,
so if you're waiting for the database to get back to you after you do
your update then this is going to be a constant cost on each iteration
of your loop.
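If the loop really has to stay, moving it server-side into a single
function call removes the per-iteration round trip. A rough sketch in
PL/pgSQL, again using the placeholder names above:

  CREATE OR REPLACE FUNCTION update_batch() RETURNS void AS $$
  DECLARE
      r RECORD;
  BEGIN
      -- the whole loop runs inside the server: one round trip
      -- from the client instead of one per iteration
      FOR r IN SELECT id, val FROM new_vals LOOP
          UPDATE mytable SET val = r.val WHERE id = r.id;
      END LOOP;
  END;
  $$ LANGUAGE plpgsql;

  SELECT update_batch();  -- single call from the client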
Additionally, each transaction is going to take a fixed amount of time
to commit things to disk. Reducing the number of transactions the
database has to perform is generally a good thing for performance, but
if it's not what your application needs then you have to look elsewhere.
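So if each of your 60-row batches currently commits on its own,
wrapping the whole run in one transaction saves all but one of those
disk syncs. A sketch, with do_update() standing in for your existing
function (a hypothetical name, adjust to match yours):

  BEGIN;
  SELECT do_update(1);  -- each call still updates its batch of 60 rows
  SELECT do_update(2);
  -- ...
  COMMIT;               -- one commit, one disk sync, for all batches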
Sam