From: Stephan Szabo <sszabo(at)megazone23(dot)bigpanda(dot)com>
To: Dave Tenny <tenny(at)attbi(dot)com>
Cc: <pgsql-performance(at)postgresql(dot)org>
Subject: Re: IN list processing performance (yet again)
Date: 2003-05-28 14:54:26
Message-ID: 20030528074020.G33203-100000@megazone23.bigpanda.com
Lists: pgsql-performance
On Wed, 28 May 2003, Dave Tenny wrote:
> Having grepped the web, it's clear that this isn't the first or last
> time this issue will be raised.
>
> My application relies heavily on IN lists. The lists are primarily
> constant integers, so queries look like:
>
> SELECT val FROM table WHERE id IN (43, 49, 1001, 100002, ...)
>
> Performance is critical, and the size of these lists depends a lot on
> how the larger 3-tier application is used,
> but it wouldn't be out of the question to retrieve 3000-10000 items.
>
> PostgreSQL 7.3.2 seems to have a lot of trouble with large lists.
The IN list gets converted into a chain of OR'd equality tests
(col = list[0] OR col = list[1] OR ...), and the planner/optimizer
seems to spend a large amount of time on that: EXPLAIN alone takes
just over 80 seconds for a 9900-item list on my machine (I don't have
a data-filled table to run the actual query against).
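For concreteness, the example query above is effectively treated like
the expanded form below (a sketch of the rewrite, not the planner's
literal output; the table name and values are carried over from the
original example):

    SELECT val FROM table
     WHERE id = 43 OR id = 49 OR id = 1001 OR id = 100002
        -- ... one OR'd comparison per element of the IN list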
Right now, the best plan is probably obtained by creating a temporary
table, copying the values into it, and joining against it.
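A minimal sketch of that approach, using "mytable" as a stand-in for
the real table from the example query and an arbitrary temp table name
"ids" (loading via COPY from the client is usually much faster than
thousands of individual INSERTs):

    CREATE TEMP TABLE ids (id integer);

    -- load the constants from the client (psql shown here)
    COPY ids FROM stdin;
    43
    49
    1001
    100002
    \.

    -- give the planner row estimates for the temporary table
    ANALYZE ids;

    SELECT t.val
      FROM mytable t
      JOIN ids i ON i.id = t.id;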
> 2) What is the expected acceptable limit for the number of items in an
> IN list predicate such as
> those used here. (List of constants, not subselects).
As a note, 7.4 by default seems to limit the list to 10000 items
unless you raise max_expr_depth, AFAICS.
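If you do hit that limit, the setting can be raised for the session;
the value below is just an illustration, assuming max_expr_depth
behaves like an ordinary run-time parameter:

    SET max_expr_depth = 20000;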