From: David Mitchell <david(dot)mitchell(at)telogis(dot)com>
To: Tom Lane <tgl(at)sss(dot)pgh(dot)pa(dot)us>
Cc: pgsql-general(at)postgresql(dot)org
Subject: Re: Vacuum advice
Date: 2005-06-23 01:47:45
Message-ID: 42BA14C1.4030201@telogis.com
Lists: pgsql-general
>>We're thinking we might set up vacuum_cost_limit to around 100 and put
>>vacuum_cost_delay at 100 and then just run vacuumdb in a cron job every
>>15 minutes or so, does this sound silly?
>
>
> It doesn't sound completely silly, but if you are doing inserts and not
> updates/deletes then there's not anything for VACUUM to do, really.
> An ANALYZE command might get the same result with less effort.
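
For the archives, the setup we had in mind looks roughly like this (the
database name is illustrative; the cost figures are the ones from my
question above):

  # postgresql.conf: cost-based vacuum delay
  vacuum_cost_delay = 100   # sleep 100ms each time the cost limit is reached
  vacuum_cost_limit = 100   # accumulated page cost that triggers the sleep

  # crontab: database-wide vacuum every 15 minutes
  */15 * * * * vacuumdb --quiet --analyze mydb
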
I think the updates we are doing on the secondary table to track the
import are the culprit here. That table is updated once for each item
inserted into the main table, so even though it has only 500 rows, it
ended up with about 2 million dead tuples, which left a lot to be
desired in terms of seq scan speed. VACUUM FULL cleared this up, so I
assume a frequent regular VACUUM would keep it in tip-top condition.
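
Something along these lines in cron should keep just that hot table in
check (the table and database names are made-up examples, not our real
ones):

  # vacuum only the heavily-updated tracking table every 5 minutes
  */5 * * * * vacuumdb --quiet --table=import_status mydb
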
We are using PG 8.0.1.
Thanks for your help, Tom.
--
David Mitchell
Software Engineer
Telogis