Re: PostgreSQL 9.2 - pg_dump out of memory when backing up a database with 300000000 large objects

From: bricklen <bricklen(at)gmail(dot)com>
To: Giuseppe Broccolo <giuseppe(dot)broccolo(at)2ndquadrant(dot)it>
Cc: "pgsql-admin(at)postgresql(dot)org" <pgsql-admin(at)postgresql(dot)org>
Subject: Re: PostgreSQL 9.2 - pg_dump out of memory when backing up a database with 300000000 large objects
Date: 2013-10-01 12:30:18
Message-ID: CAGrpgQ_safsytHcJyBwo2fT6Eu01=hJwjiZ2juac1vJQRqCjfg@mail.gmail.com
Lists: pgsql-admin pgsql-sql

On Tue, Oct 1, 2013 at 4:01 AM, Giuseppe Broccolo <giuseppe(dot)broccolo(at)2ndquadrant(dot)it> wrote:

> Maybe you can improve your database's performance by tuning some parameters:
>
>> max_connections = 500 # (change requires restart)
>
> Set it to 100, the highest value supported by PostgreSQL.

Surely you mean that max_connections = 100 is the *default*?
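
For reference, you can confirm both the active value and the shipped default from within psql (a minimal sketch; pg_settings.boot_val holds the compiled-in default):

    -- show the currently active value
    SHOW max_connections;

    -- compare the current setting against the compiled-in default
    SELECT name, setting, boot_val, source
    FROM pg_settings
    WHERE name = 'max_connections';

On a stock install, boot_val should come back as 100 here, i.e. the default, not an upper limit.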

Browse pgsql-admin by date

  Next message:     Magnus Hagander  2013-10-01 12:49:33  Re: PostgreSQL 9.2 - pg_dump out of memory when backing up a database with 300000000 large objects
  Previous message: Sergey Klochkov  2013-10-01 11:12:18  Re: PostgreSQL 9.2 - pg_dump out of memory when backing up a database with 300000000 large objects
