From: "Dann Corbit" <DCorbit(at)connx(dot)com>
To: "Tom Lane" <tgl(at)sss(dot)pgh(dot)pa(dot)us>, "Peter Bierman" <bierman(at)apple(dot)com>
Cc: <pgsql-hackers(at)postgresql(dot)org>
Subject: Re: Suggestion for optimization
Date: 2002-04-05 23:51:00
Message-ID: D90A5A6C612A39408103E6ECDD77B82920CD1E@voyager.corporate.connx.com
Lists: pgsql-hackers
-----Original Message-----
From: Tom Lane [mailto:tgl(at)sss(dot)pgh(dot)pa(dot)us]
Sent: Friday, April 05, 2002 3:42 PM
To: Peter Bierman
Cc: Dann Corbit; pgsql-hackers(at)postgresql(dot)org
Subject: Re: [HACKERS] Suggestion for optimization
Peter Bierman <bierman(at)apple(dot)com> writes:
> ... Your comment: "An
> accurate cardinality figure can greatly enhance the optimizer's
> ability to perform joins in the correct order" was intriguing, and I'd
> be interested in Tom's thoughts on just that bit.
Approximate figures are quite sufficient for the planner's purposes.
AFAICS, making them exact would not improve the planning estimates
at all, because there are too many other sources of error. We have
approximate stats already via vacuum/analyze statistics gathering.
>>
What happens if someone deletes 75% of a table?
What happens if someone imports 30 times more rows than the table already
contains?
What happens if one table is remarkably small, or even empty, and the
planner does not know it?
In extreme cases, stale estimates can mean an orders-of-magnitude
difference in performance.
<<
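The point can be illustrated with a toy cost model. This is only a sketch, not PostgreSQL's actual planner: the constant 10% join selectivity, the table names, and the row counts are all assumptions chosen for illustration. It shows how a planner picking a left-deep join order from out-of-date row counts can produce intermediate results thousands of times larger than a plan built from fresh statistics.

```python
from itertools import permutations

SELECTIVITY = 0.1  # assumed constant join selectivity; a real planner estimates this per join


def intermediate_rows(row_counts, order):
    """Estimated rows flowing out of each join in a left-deep plan."""
    rows = row_counts[order[0]]
    sizes = []
    for name in order[1:]:
        rows = rows * row_counts[name] * SELECTIVITY
        sizes.append(rows)
    return sizes


def plan_cost(row_counts, order):
    # Cost proxy: total size of the intermediate results. The final
    # result is the same size for every order, so it is excluded.
    return sum(intermediate_rows(row_counts, order)[:-1])


def choose_order(row_counts):
    # Exhaustive search over join orders, smallest estimated cost wins.
    return min(permutations(row_counts), key=lambda o: plan_cost(row_counts, o))


# The statistics still say t1 has 100,000 rows, but nearly all of them
# have since been deleted:
stale = {"t1": 100_000, "t2": 50_000, "t3": 50_000}
actual = {"t1": 10, "t2": 50_000, "t3": 50_000}

stale_plan = choose_order(stale)    # order chosen from out-of-date statistics
fresh_plan = choose_order(actual)   # order fresh statistics would choose
print("stale plan:", stale_plan, "cost on real data:", plan_cost(actual, stale_plan))
print("fresh plan:", fresh_plan, "cost on real data:", plan_cost(actual, fresh_plan))
```

Under these assumed numbers the stale statistics lead the planner to join the two 50,000-row tables first, producing a 250-million-row intermediate result, whereas fresh statistics would join the nearly empty t1 first and keep the intermediate result to 50,000 rows.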
Next Message: Josh Berkus, 2002-04-05 23:51:35, Re: 16 parameter limit
Previous Message: Tom Lane, 2002-04-05 23:41:32, Re: Suggestion for optimization