From: Guido Neitzer <guido(dot)neitzer(at)pharmaline(dot)de>
To: Radovan Antloga <radovan(dot)antloga(at)siol(dot)net>
Cc: pgsql-performance(at)postgresql(dot)org
Subject: Re: Performance decrease
Date: 2006-04-20 21:19:44
Message-ID: B5FFB3A0-7702-4926-A7CA-656CFB580B42@pharmaline.de
Lists: pgsql-hackers pgsql-performance
On 20.04.2006, at 18:10, Radovan Antloga wrote:
> Once or twice a month I have an update on many records (~6000), but
> not that many. I did not expect PG to have problems with
> updating 15800 records.
It has no problems with that. We have a database where we often
update/insert rows with about one hundred columns, with no problems so
far. Performance is in the sub-10 ms range. The whole table has about
100,000 records.
Do you wrap every update in a separate transaction? For bulk updates
I commit every 200 updates instead.
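For illustration, here is a minimal sketch of what such batched commits could
look like (not from the original mail; it assumes Python with psycopg2, a
hypothetical table "mytable" with columns id and val, and an in-memory list of
updates):

    # Minimal sketch of batched commits. Assumptions: psycopg2 is installed,
    # a table "mytable" (id, val) exists, and rows_to_update holds the new values.
    import psycopg2

    rows_to_update = [(1, "a"), (2, "b"), (3, "c")]  # placeholder data

    conn = psycopg2.connect("dbname=test")  # hypothetical connection string
    cur = conn.cursor()

    BATCH_SIZE = 200
    for i, (row_id, new_val) in enumerate(rows_to_update, start=1):
        cur.execute("UPDATE mytable SET val = %s WHERE id = %s", (new_val, row_id))
        if i % BATCH_SIZE == 0:
            conn.commit()  # one commit per 200 updates instead of one per row

    conn.commit()  # commit any remaining updates in the final partial batch
    cur.close()
    conn.close()

The point is simply to amortize the per-transaction overhead across many
updates rather than paying it on every single row.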
cug
--
PharmaLine, Essen, GERMANY
Software and Database Development