From: Tom Lane <tgl(at)sss(dot)pgh(dot)pa(dot)us>
To: James Cranch <jdc41(at)cam(dot)ac(dot)uk>
Cc: pgsql-performance(at)postgresql(dot)org
Subject: Re: Rapidly finding maximal rows
Date: 2011-10-14 04:25:53
Message-ID: 13279.1318566353@sss.pgh.pa.us
Lists: pgsql-performance
James Cranch <jdc41(at)cam(dot)ac(dot)uk> writes:
> I have a slow query. I expect that there is either a more intelligent way
> to write it, or that one could make some indexes that would speed it up.
> I've tried various indexes, and am not getting anywhere.
> I'd be grateful for any suggestions. Reasonably full details are below.
Two bits of advice:
1. Avoid unnecessary use of WITH. It acts as an optimization fence,
which you don't want here. In particular, the only way to avoid
sorting/aggregating over the whole table is for the outer query's WHERE
conditions on competition_name and academic_year_beginning to get pushed
down into the scans on challenge_entries ... and that can't happen if
there's a WITH in between. (A sketch of the inlined rewrite follows below.)
2. Try increasing work_mem. I think that your first view would work all
right if it had enough work_mem to go for a HashAgg plan instead of
sort-and-group, even without pushdown of the outer WHERE. It'd
definitely be faster than what you've got, anyway. (A settings example also
follows below.)
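To make point 1 concrete, here is a minimal sketch of the kind of rewrite
meant. The original query isn't quoted in this message, so the shape below is
hypothetical: challenge_entries, competition_name, and academic_year_beginning
come from the thread, while best_scores, entrant, score, and the literal 'X'
are invented for illustration.

    -- Before: the CTE is materialized in full (an optimization fence in
    -- PostgreSQL of this era), so the outer WHERE cannot be pushed down
    -- into the scan of challenge_entries.
    WITH best_scores AS (
        SELECT competition_name, academic_year_beginning,
               entrant, max(score) AS best
        FROM challenge_entries
        GROUP BY competition_name, academic_year_beginning, entrant
    )
    SELECT * FROM best_scores
    WHERE competition_name = 'X' AND academic_year_beginning = 2010;

    -- After: with the aggregation written inline, the planner can apply
    -- the filter before grouping, so it aggregates only the matching rows.
    SELECT competition_name, academic_year_beginning,
           entrant, max(score) AS best
    FROM challenge_entries
    WHERE competition_name = 'X' AND academic_year_beginning = 2010
    GROUP BY competition_name, academic_year_beginning, entrant;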
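For point 2, a quick way to test whether more work_mem changes the plan is to
raise it for the current session only; '64MB' is an arbitrary trial value, not
a recommendation:

    -- Session-local change; does not touch postgresql.conf.
    SET work_mem = '64MB';
    -- Re-run the query under EXPLAIN ANALYZE and check whether the plan
    -- now shows HashAggregate instead of Sort + GroupAggregate:
    -- EXPLAIN ANALYZE <the slow query>;

Since work_mem is allocated per sort or hash operation, raising it globally
can exhaust memory under concurrent load; ALTER ROLE ... SET work_mem, or SET
LOCAL inside a transaction, scopes the change more narrowly.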
The other approach with a window function is probably a lost cause.
Postgres hasn't got a lot of intelligence about optimizing
window-function queries ...
regards, tom lane