From: | "Heikki Linnakangas" <heikki(at)enterprisedb(dot)com> |
---|---|
To: | "Sven Geisler" <sgeisler(at)aeccom(dot)com> |
Cc: | <pgsql-performance(at)postgresql(dot)org> |
Subject: | Re: single transaction vs multiple transactions |
Date: | 2006-12-05 15:30:43 |
Message-ID: | 457590A3.2090006@enterprisedb.com |
Lists: pgsql-performance
Sven Geisler wrote:
> I have to refactor a 'DELETE FROM x WHERE y IN (...)' because IN got
> too many parameters. => 'stack depth limit exceeded'
> I don't want to just increase the max_stack_depth parameter. It is
> better to refactor because the number of arguments to IN may increase
> in the future.
>
> My approach is to do multiple 'DELETE FROM x WHERE y=...'.
You could also do something in between, issuing the deletes in batches
of, say, 100 deletes each. But using a temporary table is much better.
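For illustration, a batched version of the original statement might look like the sketch below; the ids are hypothetical and would be assembled in chunks in the application tier:

    -- one DELETE per batch of roughly 100 ids
    DELETE FROM x WHERE y IN (1, 2, 3 /* ... up to ~100 ids per batch ... */);
    DELETE FROM x WHERE y IN (101, 102, 103 /* ... next batch ... */);
    -- and so on until all ids are covered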
> My question now is: which is better for PostgreSQL from a performance
> perspective?
> 1. all multiple deletes in one transaction
> 2. each delete in its own transaction
All in one transaction is definitely faster.
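A quick sketch of option 1, with the per-row deletes wrapped in a single explicit transaction (table, column, and ids as in the hypothetical batch example above):

    BEGIN;
    DELETE FROM x WHERE y = 1;
    DELETE FROM x WHERE y = 2;
    -- ... one statement per id, around 10,000 in total ...
    COMMIT;

With option 2 (autocommit), each DELETE pays the commit overhead, including a WAL flush, on its own.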
> The number of arguments is around 10,000.
>
> BTW: The arguments are generated in the application tier. I would have to
> create a temporary table which I can use in 'DELETE FROM x WHERE y IN
> (SELECT z FROM tmp)'.
I think that's exactly what you should do.
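A minimal sketch of that approach, assuming integer ids that the application streams in with COPY (batched INSERTs would work as well); the names tmp and z follow the quoted statement:

    BEGIN;
    CREATE TEMPORARY TABLE tmp (z integer) ON COMMIT DROP;
    COPY tmp FROM STDIN;   -- load the ~10,000 ids from the application
    DELETE FROM x WHERE y IN (SELECT z FROM tmp);
    COMMIT;

With the ids in a table, the planner can turn the IN into a join (e.g. a hashed subplan) instead of expanding a 10,000-element list, and the stack depth limit is no longer an issue.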
--
Heikki Linnakangas
EnterpriseDB http://www.enterprisedb.com