From: Tom Lane <tgl(at)sss(dot)pgh(dot)pa(dot)us>
To: Greg Stark <gsstark(at)mit(dot)edu>
Cc: "Kevin Grittner" <Kevin(dot)Grittner(at)wicourts(dot)gov>, josh(at)agliodbs(dot)com, pgsql-hackers(at)postgresql(dot)org
Subject: Re: A costing analysis tool
Date: 2005-10-15 22:06:01
Message-ID: 2887.1129413961@sss.pgh.pa.us
Lists: pgsql-hackers
Greg Stark <gsstark(at)mit(dot)edu> writes:
> If the optimizer didn't collapse the cost for each node into a single value
> and instead retained the individual parameters at each node it could bubble
> those values all the way up to the surface. Then use the configuration options
> like random_page_cost etc to calculate the resulting cost once.
Hardly --- how will you choose the best subplans if you don't calculate
their costs?
It might be possible to remember where the costs came from, but I'm
unconvinced that there's much gold to be mined that way.
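Concretely, the decomposition being proposed might look something like the
rough sketch below (hypothetical names throughout, not the planner's actual
data structures): each path carries its raw resource counts, and the scalar
cost is recovered as a dot product with the per-resource cost settings. The
catch is the one above: that dot product still has to be evaluated every time
two candidate subplans are compared.

/*
 * Hypothetical sketch only -- not PostgreSQL's Cost machinery.  A path keeps
 * raw resource counts, and the scalar cost is a dot product with the
 * configurable per-resource costs.
 */
typedef struct CostVector
{
    double      seq_pages;      /* sequential page fetches */
    double      random_pages;   /* random page fetches */
    double      cpu_tuples;     /* tuples processed */
    double      cpu_operators;  /* operator/function evaluations */
} CostVector;

/* stand-ins for the GUC knobs (random_page_cost, cpu_tuple_cost, ...) */
static double my_seq_page_cost = 1.0;
static double my_random_page_cost = 4.0;
static double my_cpu_tuple_cost = 0.01;
static double my_cpu_operator_cost = 0.0025;

static double
cost_vector_total(const CostVector *v)
{
    return v->seq_pages * my_seq_page_cost +
           v->random_pages * my_random_page_cost +
           v->cpu_tuples * my_cpu_tuple_cost +
           v->cpu_operators * my_cpu_operator_cost;
}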
I'm also a bit suspicious of the "it's all a linear equation" premise,
because the fact of the matter is that the cost estimates are already
nonlinear, and are likely to get more so rather than less so as we learn
more. A case in point is that the reason nestloop costing sucks so
badly at the moment is that it fails to account for cache effects in
repeated scans ... which is definitely a nonlinear effect.
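To make the nonlinearity concrete: a cache-aware estimate of repeated inner
scans has to recognize that the number of distinct pages actually read
saturates at the relation size rather than growing linearly with the number
of probes. A minimal sketch, using the Mackert-Lohman approximation in its
simplest case (the function name is made up):

#include <math.h>

/*
 * Estimate distinct pages fetched when 'tuples_fetched' tuples are pulled
 * from a relation of 'pages_in_rel' pages with caching in effect.
 * Illustrative only.
 */
static double
pages_fetched_with_cache(double tuples_fetched, double pages_in_rel)
{
    double      pages;

    if (pages_in_rel <= 0.0 || tuples_fetched <= 0.0)
        return 0.0;

    /* 2TN / (2T + N), clamped to the relation size T */
    pages = (2.0 * pages_in_rel * tuples_fetched) /
            (2.0 * pages_in_rel + tuples_fetched);
    return fmin(pages, pages_in_rel);
}

With an estimate like that, doubling the number of outer tuples does not
double the pages fetched, which is exactly the behavior a single collapsed
per-page coefficient cannot express.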
regards, tom lane