From: Yuto Hayamizu <y(dot)hayamizu(at)gmail(dot)com>
To: Tom Lane <tgl(at)sss(dot)pgh(dot)pa(dot)us>
Cc: Robert Haas <robertmhaas(at)gmail(dot)com>, Ashutosh Bapat <ashutosh(dot)bapat(at)enterprisedb(dot)com>, Thomas Munro <thomas(dot)munro(at)enterprisedb(dot)com>, Pg Hackers <pgsql-hackers(at)postgresql(dot)org>
Subject: Re: [HACKERS] [PATCH] Overestimated filter cost and its mitigation
Date: 2018-01-29 05:12:32
Message-ID: CANE+7D9CJZc4RsJqq5oC7TptDyVPbrvex618deprGknkgsMqqA@mail.gmail.com
Lists: pgsql-hackers
On Fri, Jan 19, 2018 at 5:07 PM, Yuto Hayamizu <y(dot)hayamizu(at)gmail(dot)com> wrote:
> My idea for improving this patch is to introduce a threshold N_limit:
> for q_1 ... q_{N_limit}, do the same weighted cost estimation as in
> the current version of this patch.
> For q_{N_limit+1} and later quals, stop calling
> clauselist_selectivity to compute the weight and instead reuse the
> result of clauselist_selectivity({q_1, q_2, ..., q_{N_limit}}).
> For example, with N_limit=100 the additional overhead is only
> sub-milliseconds per range table entry,
> and the cost estimate is still better than the current postgres
> implementation.
The attached patch implements the improvement idea above.
With the patch applied, the performance degradation on a test query
with many quals was under 1%.
An example test query is attached.
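
For reference, the capped weighting scheme looks roughly like the
sketch below. This is illustrative only, not the patch code:
estimate_filter_cost is a hypothetical name, while
clauselist_selectivity and cost_qual_eval_node are the existing
planner routines.

/*
 * Sketch (not the patch itself) of weighted filter cost estimation
 * capped at n_limit quals, as described in the quoted text above.
 */
static Cost
estimate_filter_cost(PlannerInfo *root, List *quals, int n_limit)
{
    Cost        total = 0.0;
    Selectivity weight = 1.0;   /* fraction of rows reaching this qual */
    List       *evaluated = NIL;
    int         i = 0;
    ListCell   *lc;

    foreach(lc, quals)
    {
        Node       *qual = (Node *) lfirst(lc);
        QualCost    qcost;

        cost_qual_eval_node(&qcost, qual, root);

        /* A qual is evaluated only on rows that passed all prior quals. */
        total += weight * qcost.per_tuple;

        if (i < n_limit)
        {
            /*
             * Up to n_limit quals, recompute the weight as the combined
             * selectivity of all quals evaluated so far.
             */
            evaluated = lappend(evaluated, qual);
            weight = clauselist_selectivity(root, evaluated, 0,
                                            JOIN_INNER, NULL);
        }
        /* Beyond n_limit, reuse the last computed weight as-is. */
        i++;
    }
    return total;
}

This keeps the number of clauselist_selectivity calls bounded by
n_limit per relation, which is where the sub-millisecond overhead
figure quoted above comes from.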
regards,
----
Yuto Hayamizu
Attachments:
  Mitigate-filter-cost-overestimation-v3.patch (application/octet-stream, 10.6 KB)
  testquery.sql (application/octet-stream, 32.7 KB)