From: "Mark Dexter" <MDEXTER(at)dexterchaney(dot)com>
To: "Richard Huxton" <dev(at)archonet(dot)com>
Cc: pgsql-general(at)postgresql(dot)org
Subject: Re: VACUUM and ANALYZE With Empty Tables
Date: 2004-11-24 17:02:45
Message-ID: 5E8F9F5B63726C48836757FE673B584E180AA8@dcimail.dexterchaney.local
Lists: pgsql-general
Thanks very much for the information. It would appear that our best option might be to vacuum analyze these tables from our application at a point when they contain rows, instead of doing it at night. Needless to say, it would be nice to have an option to analyze with a target number of rows instead of the number presently in the table.
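For instance, something along these lines issued from the application right after the work table is loaded (a rough sketch; the table and column names are only placeholders):

    -- populate the work table first, then refresh statistics while rows exist
    INSERT INTO batch_work (order_id, qty) VALUES (1, 10);  -- placeholder load
    VACUUM ANALYZE batch_work;  -- planner statistics now reflect a non-empty table
    -- ... run the batch processing, then TRUNCATE/DELETE as usual

One caveat: VACUUM cannot run inside a transaction block, so the statement would have to be sent with autocommit on if the ODBC driver normally wraps statements in a transaction.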
I suppose another option would be to keep a small number of rows permanently in these tables. In my testing, 100 rows (94 to be exact) did the trick. Is this number going to vary from table to table?
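A way to check would be to compare what ANALYZE records for the table when it is empty versus with the ~100 rows in place (table name below is a placeholder):

    -- row and page counts recorded by VACUUM/ANALYZE and used by the planner
    SELECT relname, reltuples, relpages
    FROM pg_class
    WHERE relname = 'batch_work';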
Thanks again for your help. Mark
________________________________
From: Richard Huxton [mailto:dev(at)archonet(dot)com]
Sent: Wed 11/24/2004 1:26 AM
To: Mark Dexter
Cc: pgsql-general(at)postgresql(dot)org
Subject: Re: [GENERAL] VACUUM and ANALYZE With Empty Tables
Mark Dexter wrote:
> We use a development environment that works with Postgres via ODBC and
> uses cursors to insert and update rows in Postgres tables. I'm using
> Postgres version 7.4.5.
> A. If I TRUNCATE or DELETE all of the rows in the table and then run
> VACUUM or ANALYZE on the empty table, the test program takes over 15
> minutes to complete (i.e., 15X performance drop).
> If we routinely run VACUUM or VACUUM ANALYZE (e.g., nightly), these work
> tables will normally be empty when the VACUUM is run. So it would
> appear from the testing above that they will experience performance
> problems when inserting large numbers of rows through our application.
Yep - it's a known issue. The analyse is doing what you asked, it's just
not what you want.
> Is there some easy way around this problem? Is there a way to force
> VACUUM or ANALYZE to optimize for a set number of rows even if the table
> is empty when it is run? Thanks for your help. Mark
There are only two options I know of:
1. Vacuum analyse each table separately (tedious, I know) -- see the sketch below
2. Try pg_autovacuum in the contrib/ directory
The autovacuum utility monitors activity for you and targets tables when
they've seen a certain amount of activity. Even if it hasn't got the
tunability you need, it should be a simple patch to add a list of
"excluded" tables.
--
Richard Huxton
Archonet Ltd