Large Table Updates, causing memory exhaustion. Is a transaction wrapper the answer?

From: "Michael Miyabara-McCaskey" <mykarz(at)miyabara(dot)com>
To: <pgsql-novice(at)postgresql(dot)org>
Subject: Large Table Updates, causing memory exhaustion. Is a transaction wrapper the answer?
Date: 2000-12-07 20:21:47
Message-ID: 007d01c0608b$53a8dcb0$aa00a8c0@ncc1701e
Lists: pgsql-novice

Hello all,

I am new to the PostgreSQL world.

I am creating a new DB with a lot of records; for instance, one table that I
need to do updates on is about 5.5GB in size.

Doing a simple operation to uppercase the records, I keep exhausting the
memory of the backend. I'm thinking that writing my SQL statement within a
transaction is the correct method... Is it? If so, what is the best way to
loop it through all the records so that I don't run into memory exhaustion
again?

The original SQL statement I have been using is:
"UPDATE table_name SET field_1 = UPPER(field_1);"

Any help would be appreciated, and thank you in advance.

-Michael Miyabara-McCaskey
