Re: Thousands of schemas and ANALYZE goes out of memory

From: Jeff Janes <jeff(dot)janes(at)gmail(dot)com>
To: Hugo <hugo(dot)tech(at)gmail(dot)com>
Cc: pgsql-general(at)postgresql(dot)org
Subject: Re: Thousands of schemas and ANALYZE goes out of memory
Date: 2012-10-01 22:39:20
Message-ID: CAMkU=1y1BCDbGyuUtU8uCzQY89eurYhwes2Oe=bTBbiQ_-ZELQ@mail.gmail.com
Lists: pgsql-general

On Mon, Oct 1, 2012 at 12:52 PM, Hugo <hugo(dot)tech(at)gmail(dot)com> wrote:
> Hi everyone,
>
> We have two postgresql 9.0 databases (32-bits)

Why 32 bits? Is that what your hardware is?

> with more than 10,000
> schemas. When we try to run ANALYZE in those databases we get errors like
> this (after a few hours):
>
> 2012-09-14 01:46:24 PDT ERROR: out of memory
> 2012-09-14 01:46:24 PDT DETAIL: Failed on request of size 421.
> 2012-09-14 01:46:24 PDT STATEMENT: analyze;
>
> (Note that we do have plenty of memory available for postgresql:
> shared_buffers=2048MB, work_mem=128MB, maintenance_work_mem=384MB,
> effective_cache_size = 3072MB, etc.)

That might be the problem. I think with 32 bits, you only have about
2GB of address space available to any given process, and you just
allowed shared_buffers to grab all of it.
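The arithmetic behind that can be sketched roughly as follows (assuming a 2GB per-process user address space on this 32-bit build; the exact limit varies by OS and kernel configuration):

```python
# Rough address-space budget for one 32-bit PostgreSQL backend.
# ASSUMPTION: a 2GB user address space; some platforms allow 3GB
# or close to 4GB, so treat this as an illustration, not a rule.
ADDRESS_SPACE_MB = 2048

shared_buffers_mb = 2048  # setting from the original post; mapped
                          # into every backend's address space

# What is left for the backend's code, stack, and the per-table
# state ANALYZE accumulates while walking 10,000+ schemas:
remaining_mb = ADDRESS_SPACE_MB - shared_buffers_mb
print(remaining_mb)
```

With nothing left over, even small allocations (like the 421-byte request in the error message) can fail once the catalog-heavy ANALYZE has consumed what little headroom the mapping layout happens to leave. Lowering shared_buffers, or moving to a 64-bit build, would relieve the pressure.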

Cheers,

Jeff
