From: Chris Mair <chris(at)1006(dot)org>
To: Job <Job(at)colliniconsulting(dot)it>, "pgsql-general(at)postgresql(dot)org" <pgsql-general(at)postgresql(dot)org>
Subject: Re: Query killed with Out of memory
Date: 2016-10-04 18:03:09
Message-ID: 182da2374c6b79edce4f5baabc05afdb@smtp.hushmail.com
Lists: pgsql-general
> With a heavy query, when the number of result rows rises above 600k, the query hangs with out of memory.
>
> Here is the explain analyze:
> [...]
> work_mem is 512MB, shared_buffers is 3084MB and system RAM is 10GB. Postgres version is 8.4.8 and for some months I cannot upgrade.
>
> Is there a way to solve the problem?
Hi,
a few ideas:
- What goes out of memory? The client? If you query from a programming language, you should set the fetch/batch size
to some value so that the client doesn't pull the whole 600k rows into memory at once (see the sketch after this list)... for the psql client you can do
\set FETCH_COUNT 1000
- work_mem at 512 MB is on the high side unless you have a small value for max_connections...
- 8.4.8 was released in 2011; the latest 8.4 release is 8.4.22, so you're missing lots of patches (and 8.4 was EOLed more
than two years ago)
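
For example, here is a minimal sketch of batched fetching with Python/psycopg2 (the table name and
connection string are placeholders, not taken from your report). A named cursor keeps the result set
on the server side, so the client only holds one batch of rows in memory at a time:

  # minimal sketch, assuming psycopg2; "big_table" and the DSN are hypothetical
  import psycopg2

  conn = psycopg2.connect("dbname=test")      # placeholder connection string
  cur = conn.cursor(name="batched_cursor")    # named cursor => server-side cursor
  cur.itersize = 1000                         # rows fetched per network round trip
  cur.execute("SELECT * FROM big_table")
  for row in cur:                             # iterates in batches of itersize rows
      pass                                    # process each row here
  cur.close()
  conn.close()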
Bye,
Chris.