From: Scott Marlowe <scott(dot)marlowe(at)gmail(dot)com>
To: "Lee, Mija" <mija(at)scharp(dot)org>
Cc: pgsql-admin(at)postgresql(dot)org
Subject: Re: out of memory for query, partitioning & vacuuming
Date: 2009-07-15 19:16:04
Message-ID: dcc563d10907151216o7bcd71c2gf0db8b136b001789@mail.gmail.com
Lists: pgsql-admin
On Wed, Jul 15, 2009 at 12:59 PM, Lee, Mija <mija(at)scharp(dot)org> wrote:
> Hi -
> I'm not a particularly experienced dba, so I'm hoping this isn't a
> ridiculous question.
> I have a 5 GB table with lots of churn in a 14 GB database. Querying
> this one table without limits has just started throwing "out of memory
> for query" from multiple clients (psql, java).
If you're getting this error on the client side (i.e. it's not a message
coming from PostgreSQL itself, it's coming from the client software), then
nothing you do on the server side, whether configuration or partitioning,
is going to help.
Accessing the data via a cursor is usually the best way around that error.
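A cursor lets the client pull the result set in batches instead of buffering every row in memory at once. As a rough sketch in psql (the table name here is a placeholder for the 5 GB table in question), that could look like:

```sql
-- Cursors only exist inside a transaction block.
BEGIN;

-- Declare a cursor over the unlimited query that was running out of memory.
DECLARE big_cur CURSOR FOR SELECT * FROM my_large_table;

-- Pull rows in manageable batches; repeat until FETCH returns no rows.
FETCH 10000 FROM big_cur;
FETCH 10000 FROM big_cur;

CLOSE big_cur;
COMMIT;
```

From Java, the PostgreSQL JDBC driver does this for you if you disable autocommit and call `setFetchSize()` on the Statement; the driver then fetches rows through a server-side cursor instead of materializing the whole result set on the client.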
Next Message: Anj Adu | 2009-07-15 19:18:01 | Re: out of memory for query, partitioning & vacuuming
Previous Message: Lee, Mija | 2009-07-15 18:59:23 | out of memory for query, partitioning & vacuuming