Re: [8.0.0] out of memory on large UPDATE

From: "Marc G(dot) Fournier" <scrappy(at)postgresql(dot)org>
To: Tom Lane <tgl(at)sss(dot)pgh(dot)pa(dot)us>
Cc: pgsql-bugs(at)postgresql(dot)org
Subject: Re: [8.0.0] out of memory on large UPDATE
Date: 2005-08-11 16:54:46
Message-ID: 20050811135329.E1002@ganymede.hub.org
Lists: pgsql-bugs

On Thu, 11 Aug 2005, Tom Lane wrote:

> "Marc G. Fournier" <scrappy(at)postgresql(dot)org> writes:
>> The table contains ~10 million rows:
>
>> # time psql -c "UPDATE xa_url SET url = url;" -U pgsql pareto
>> ERROR: out of memory
>> DETAIL: Failed on request of size 32.
>
> If you've got any AFTER UPDATE triggers on that table, you could be
> running out of memory for the pending-triggers list.
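One quick way to double-check which triggers are actually attached to the table is to pull them straight from the catalogs. A minimal sketch, assuming the 8.0-era pg_catalog layout:

    -- list user triggers on xa_url (excluding FK constraint triggers)
    SELECT tgname, pg_get_triggerdef(oid)
      FROM pg_catalog.pg_trigger
     WHERE tgrelid = 'xa_url'::regclass
       AND NOT tgisconstraint;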

Nope, I only have a BEFORE UPDATE trigger -- or would that behave the same
way, just running out of memory at a different point?

Triggers:
xa_url_domain_b_i_u BEFORE INSERT OR UPDATE ON xa_url FOR EACH ROW EXECUTE PROCEDURE xa_url_domain()
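As an aside on the failing statement itself: one way to avoid rewriting ~10 million rows in a single statement is to run the UPDATE in keyed batches. This is only a rough sketch -- it assumes xa_url has an integer primary key named url_id, which isn't shown anywhere in this thread:

    -- hypothetical batching sketch; url_id is an assumed primary key column
    UPDATE xa_url SET url = url WHERE url_id BETWEEN 1 AND 1000000;
    -- commit, then repeat with the next range until the whole table is covered
    UPDATE xa_url SET url = url WHERE url_id BETWEEN 1000001 AND 2000000;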

----
Marc G. Fournier Hub.Org Networking Services (http://www.hub.org)
Email: scrappy(at)hub(dot)org Yahoo!: yscrappy ICQ: 7615664
