From: Harald Fuchs <hari(dot)fuchs(at)googlemail(dot)com>
To: pgsql-general(at)postgresql(dot)org
Subject: Re: generate_series woes
Date: 2008-04-16 11:55:10
Message-ID: puskxmypy9.fsf@srv.protecting.net
Lists: pgsql-general
In article <b42b73150804150715r83cad1doa166230ec509f0d(at)mail(dot)gmail(dot)com>,
"Merlin Moncure" <mmoncure(at)gmail(dot)com> writes:
> On Mon, Apr 14, 2008 at 5:21 AM, Harald Fuchs <hari(dot)fuchs(at)googlemail(dot)com> wrote:
>> I think there's something sub-optimal with generate_series.
>> In the following, "documents" is a table with more than 120000 rows,
>> vacuumed and analyzed before the queries.
> Everything is working exactly as intended. While it's obvious to you
> that the generate_series function returns a particular number of rows
> based on your supplied inputs, it's not (yet) obvious to the planner.
Which was exactly my point. Since generate_series is a built-in
function, the planner could theoretically know the number of rows
returned and thus choose a better plan.
OTOH, the difference between theory and reality is in theory smaller
than in reality.
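
For what it's worth, here's a sketch (not my original test query, and
the plan output will vary) of how to see the estimate the planner falls
back to: for a set-returning function it simply assumes the default of
ROWS 1000, no matter what arguments generate_series receives.

  -- The Function Scan node shows the planner's default row estimate,
  -- not a value derived from the arguments.
  EXPLAIN SELECT * FROM generate_series(1, 120000) AS g(i);

Joined against a large table like "documents", that estimate can easily
push the planner toward the wrong join strategy.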
> Your genser function supplies the hint the planner needs, and it
> adjusts the plan. For most set-returning functions (particularly
> non-immutable ones), the number of rows can't easily be determined
> from the input parameters anyway.
Yes, of course. I used "genser" just to show that there is a better plan.
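
For anyone curious, a wrapper along these lines (a sketch only; my
actual genser may differ) is enough to give the planner a usable
estimate, since CREATE FUNCTION accepts a ROWS clause as of 8.3:

  -- Hypothetical wrapper: returns the same rows as generate_series,
  -- but carries an explicit (necessarily constant) row-count hint.
  CREATE OR REPLACE FUNCTION genser(lo integer, hi integer)
  RETURNS SETOF integer AS $$
      SELECT * FROM generate_series($1, $2);
  $$ LANGUAGE sql IMMUTABLE
  ROWS 120000;

The ROWS value has to be a constant, so it's only a static hint; it
can't be derived from the actual arguments at call time.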