Re: abusing plpgsql array variables

From: Artacus <artacus(at)comcast(dot)net>
To: Ben <bench(at)silentmedia(dot)com>
Cc: PostgreSQL <pgsql-general(at)postgresql(dot)org>
Subject: Re: abusing plpgsql array variables
Date: 2008-09-11 05:19:07
Message-ID: 48C8AA4B.9050301@comcast.net
Lists: pgsql-general


> If I want to pass in a text[] argument to a plpgsql function, at what
> array size am I asking for problems? 100? 10,000? 100,000?
>
> What severity of problems might I encounter? Bad performance? Postgres
> refusing to run my query? A crashed backend?

Yeah, like you, I was pretty worried about how it would handle larger
arrays. But I was surprised to find that it did a super job of handling
even large ones.
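For what it's worth, here's a minimal sketch of the kind of function in
question. The name and body are just illustrative, not anything from the
original question:

```sql
-- Hypothetical example: a plpgsql function taking a text[] argument.
CREATE OR REPLACE FUNCTION count_elements(tags text[])
RETURNS integer AS $$
DECLARE
    n integer;
BEGIN
    -- array_upper() gives the upper bound of a one-dimensional array
    n := array_upper(tags, 1);
    RAISE NOTICE 'received % elements', n;
    RETURN n;
END;
$$ LANGUAGE plpgsql;

-- Call it with an array literal:
SELECT count_elements(ARRAY['a', 'b', 'c']);  -- returns 3
```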

One warning, though: if you are going to filter a table based on values
in a large array, don't do something like:

WHERE foo = ANY (some_large_array)

Instead, explode it using a set-returning function and join it like a table:

JOIN explode(some_large_array) e ON ...
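Note that explode() isn't a built-in; a minimal plpgsql sketch of such a
set-returning function might look like this (on PostgreSQL 8.4 and later,
the built-in unnest() does the same job):

```sql
-- A set-returning function that turns a text[] into rows.
CREATE OR REPLACE FUNCTION explode(arr text[])
RETURNS SETOF text AS $$
DECLARE
    i integer;
BEGIN
    FOR i IN array_lower(arr, 1) .. array_upper(arr, 1) LOOP
        RETURN NEXT arr[i];
    END LOOP;
    RETURN;
END;
$$ LANGUAGE plpgsql;

-- Then filter by joining it like a table instead of using = ANY
-- (foo and some_large_array are placeholders for your own table/variable):
SELECT f.*
FROM foo f
JOIN explode(some_large_array) AS e(val) ON f.foo = e.val;
```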
