From: Tom Lane <tgl(at)sss(dot)pgh(dot)pa(dot)us>
To: Andrew Gierth <andrew(at)tao11(dot)riddles(dot)org(dot)uk>
Cc: pgsql-hackers(at)postgresql(dot)org
Subject: Re: Join Filter vs. Index Cond (performance regression 9.1->9.2+/HEAD)
Date: 2015-06-01 19:31:25
Message-ID: 1603.1433187085@sss.pgh.pa.us
Lists: pgsql-hackers
Andrew Gierth <andrew(at)tao11(dot)riddles(dot)org(dot)uk> writes:
> "Tom" == Tom Lane <tgl(at)sss(dot)pgh(dot)pa(dot)us> writes:
> Tom> Once you're down to an estimate of one row retrieved, adding
> Tom> additional index conditions simply increases the cost (not by
> Tom> much, but it increases) without delivering any visible benefit.
> OK, but this is a serious problem because "estimate of one row" is a
> very common estimation failure mode, and isn't always solvable in the
> sense of arranging for better estimates (in the absence of hints, ugh).

Yeah.  I've occasionally wondered about removing the clamp-to-one-row
behavior, so that additional conditions would still look like they
contributed something (ie, 0.1 row is better than 1 row).  However,
that seems likely to break about as many cases as it fixes :-(.

A variant of that would be to only allow the minimum to be 1 row if
we are absolutely certain that's what we'll get (eg, we're searching
on a unique-key equality condition), and otherwise clamp to at least
2 rows.  Again though, this would destabilize lots of cases that
work well today.
I doubt there are any simple solutions here.
regards, tom lane