From: Giuseppe Broccolo <giuseppe(dot)broccolo(at)2ndquadrant(dot)it>
To: pgsql-admin(at)postgresql(dot)org
Subject: Re: PostgreSQL 9.2 - pg_dump out of memory when backuping a database with 300000000 large objects
Date: 2013-10-01 11:01:39
Message-ID: 524AAB93.7070308@2ndquadrant.it
Lists: pgsql-admin pgsql-sql
Maybe you can improve your database's performance by tuning some parameters:
>
> PostgreSQL configuration:
>
> listen_addresses = '*' # what IP address(es) to listen on;
> port = 5432 # (change requires restart)
> max_connections = 500 # (change requires restart)
Set it to 100, the PostgreSQL default: every allowed connection reserves memory, so 500 is usually excessive.
> shared_buffers = 16GB # min 128kB
This value should not be higher than 8GB
> temp_buffers = 64MB # min 800kB
> work_mem = 512MB # min 64kB
> maintenance_work_mem = 30000MB # min 1MB
Given 96GB of RAM, you could set it to at most 4800MB (about 5% of RAM).
> checkpoint_segments = 70 # in logfile segments, min 1, 16MB each
> effective_cache_size = 50000MB
Given 96GB of RAM, you could raise it to 80GB.
>
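Putting it together, a minimal postgresql.conf sketch with the values suggested above (starting points only; the right numbers depend on your workload and on how much memory the dump itself needs):

  max_connections = 100          # down from 500; each connection slot reserves memory (restart required)
  shared_buffers = 8GB           # down from 16GB (restart required)
  temp_buffers = 64MB            # unchanged
  work_mem = 512MB               # unchanged
  maintenance_work_mem = 4800MB  # about 5% of 96GB RAM, down from 30000MB
  checkpoint_segments = 70       # unchanged
  effective_cache_size = 80GB    # planner hint only; no memory is actually allocated

Changing max_connections or shared_buffers requires a server restart; for the other parameters a reload (pg_ctl reload) is enough.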
Hope this helps.
Giuseppe.
--
Giuseppe Broccolo - 2ndQuadrant Italy
PostgreSQL Training, Services and Support
giuseppe(dot)broccolo(at)2ndQuadrant(dot)it | www.2ndQuadrant.it
Next Message | Sergey Klochkov | 2013-10-01 11:12:18 | Re: PostgreSQL 9.2 - pg_dump out of memory when backuping a database with 300000000 large objects
Previous Message | Sergey Klochkov | 2013-10-01 10:46:16 | Re: PostgreSQL 9.2 - pg_dump out of memory when backuping a database with 300000000 large objects