From: Dave Tenny <tenny(at)attbi(dot)com>
To: Bruno Wolff III <bruno(at)wolff(dot)to>
Cc: Andreas Pflug <Andreas(dot)Pflug(at)web(dot)de>, pgsql-performance(at)postgresql(dot)org
Subject: Re: IN list processing performance (yet again)
Date: 2003-05-28 20:01:34
Message-ID: 3ED5159E.2050407@attbi.com
Lists: pgsql-performance
Bruno Wolff III wrote:
>On Wed, May 28, 2003 at 14:08:02 -0400,
>  Dave Tenny <tenny(at)attbi(dot)com> wrote:
>
>>Andreas Pflug wrote:
>>
>>I'm reminded to relay to the PostgreSQL devos that I might be able to do
>>more in the join or subquery department if PostgreSQL had better-performing
>>MAX functions and a FIRST function for selecting rows from groups.
>>("Performing" being the operative word here, since the extensible
>>architecture of PostgreSQL currently makes for poorly performing MAX
>>capabilities and presumably similar user-defined aggregate functions.)
>
>Have you tried replacing max with a subselect that uses order by and limit?
I'm uncertain how that would work, since somewhere in there I still need to
enumerate the 1000 items I want, and they aren't necessarily in any
particular range, nor do they form any contiguous group.

Also, IN (subquery) is a known performance problem in PostgreSQL, at least
if the subquery is going to return many rows. It's too bad, since I'm rather
fond of subqueries, but I avoid them like the plague in PostgreSQL.
Perhaps I don't understand what you had in mind.
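If the rewrite you mean is the usual max()-to-subselect trick, then against
a hypothetical table foo with an indexed column val (table and column names
are mine, purely for illustration) it would look something like this:

```sql
-- Aggregate form: PostgreSQL evaluates max() by scanning every row.
SELECT max(val) FROM foo;

-- Subselect form: with an index on foo(val), the planner can satisfy
-- ORDER BY ... DESC LIMIT 1 by reading a single index entry.
SELECT val FROM foo ORDER BY val DESC LIMIT 1;
```

That handles a single max() nicely, but I still don't see how to fold my
1000 arbitrary keys into it.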