From: marcin mank <marcin(dot)mank(at)gmail(dot)com>
To: Noah Misch <noah(at)leadboat(dot)com>
Cc: Tom Lane <tgl(at)sss(dot)pgh(dot)pa(dot)us>, Kevin Grittner <Kevin(dot)Grittner(at)wicourts(dot)gov>, Robert Haas <robertmhaas(at)gmail(dot)com>, pgsql-hackers(at)postgresql(dot)org, Andres Freund <andres(at)anarazel(dot)de>
Subject: Re: *_collapse_limit, geqo_threshold
Date: 2009-07-14 10:18:56
Message-ID: b1b9fac60907140318o26cd4ed9wd0b112fcb8b0657d@mail.gmail.com
Lists: pgsql-hackers
On Thu, Jul 9, 2009 at 5:38 AM, Noah Misch <noah(at)leadboat(dot)com> wrote:
> Describing in those terms illuminates much. While the concepts do suggest 2^N
> worst-case planning cost, my artificial test case showed a rigid 4^N pattern;
> what could explain that?
>
Isn't that just because the planner has to examine O(2^N) subsets of
relations, and do up to O(2^N) work for each of them? To create a level-N
join the planner chooses pairs of level-k and level-(N-k) joins. The count
of level-k joins is O(2^k), and the count of level-(N-k) ones is
O(2^(N-k)). Together that is O(N) * O(2^N) * O(2^k) * O(2^(N-k)), which is
O(N * 4^N).
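
Purely as an illustration (my own sketch, not planner code and not from this
thread): the little C program below brute-forces, for n mutually joinable
relations, how many ordered (left, right) sub-join pairs a bottom-up join
search has to look at, so you can watch the exponential blow-up behind that
estimate.

#include <stdio.h>

/*
 * Illustration only, not PostgreSQL code: count the ordered
 * (left, right) sub-join pairs a bottom-up join search examines
 * when every one of n relations can join to every other.
 */
int
main(void)
{
    for (int n = 2; n <= 14; n++)
    {
        unsigned long long pairs = 0;

        /* every non-empty subset of the n relations is a join relation */
        for (unsigned int s = 1; s < (1u << n); s++)
        {
            /* enumerate the proper non-empty subsets l of s (left side) */
            for (unsigned int l = (s - 1) & s; l != 0; l = (l - 1) & s)
                pairs++;        /* the right side is s & ~l */
        }
        printf("n = %2d: %llu pairs\n", n, pairs);
    }
    return 0;
}

The raw pair count it prints grows roughly like 3^n, comfortably inside the
O(N * 4^N) bound above; the real planner of course does extra work per pair
on top of the bare enumeration.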
This is for the worst case. If we could make a better estimate of the
required planning time (I believe the input data for a good heuristic is a
matrix saying which relation is constrained to which), we could make better
decisions about when to flatten subqueries, collapse joins, launch geqo...
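
Again just a hypothetical sketch of mine (the matrix name, the chain example
and the connected-subset count are my assumptions, not anything in the
planner): given an n x n boolean matrix of join constraints, one crude
effort indicator is the number of connected subsets of the join graph, since
each connected subset is a join relation an exhaustive search would have to
build.

#include <stdio.h>
#include <stdbool.h>

/*
 * Hypothetical sketch: estimate planning effort from a matrix that
 * says which relation is constrained to which.  Each connected subset
 * of the join graph is a join relation an exhaustive search must
 * build, so counting them gives a crude cost indicator.
 */
#define MAXREL 16

static bool constrained[MAXREL][MAXREL];

/* true if the relations in 'subset' form a connected join graph */
static bool
subset_connected(unsigned int subset, int n)
{
    unsigned int reached = 0;

    /* seed with the lowest-numbered relation in the subset */
    for (int i = 0; i < n; i++)
    {
        if (subset & (1u << i))
        {
            reached = 1u << i;
            break;
        }
    }

    /* grow 'reached' until no more relations can be added */
    for (bool grew = true; grew;)
    {
        grew = false;
        for (int i = 0; i < n; i++)
        {
            if (!(subset & (1u << i)) || (reached & (1u << i)))
                continue;
            for (int j = 0; j < n; j++)
            {
                if ((reached & (1u << j)) && constrained[i][j])
                {
                    reached |= 1u << i;
                    grew = true;
                    break;
                }
            }
        }
    }
    return reached == subset;
}

int
main(void)
{
    int         n = 8;
    unsigned long long joinrels = 0;

    /* example constraint matrix: a simple join chain r0-r1-...-r7 */
    for (int i = 0; i + 1 < n; i++)
        constrained[i][i + 1] = constrained[i + 1][i] = true;

    for (unsigned int s = 1; s < (1u << n); s++)
        if (subset_connected(s, n))
            joinrels++;

    printf("candidate join relations: %llu\n", joinrels);
    return 0;
}

For the chain of 8 this prints 36; if every relation were constrained to
every other it would be 2^8 - 1 = 255, which is the kind of difference such
a heuristic could use when deciding whether to collapse or hand things to
geqo.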
Greetings
Marcin