Re: pg_dump out of memory

From: George Neuner <gneuner2(at)comcast(dot)net>
To: pgsql-general(at)postgresql(dot)org
Subject: Re: pg_dump out of memory
Date: 2018-07-04 04:39:12
Message-ID: 5ajojdh1qdj8ieftktcp95fo8hcq96m9pj@4ax.com
Lists: pgsql-general

On Tue, 3 Jul 2018 21:43:38 -0500, Andy Colson <andy(at)squeakycode(dot)net>
wrote:

>Hi All,
>
>I moved a physical box to a VM, and set its memory to 1Gig. Everything
>runs fine except one backup:
>
>
>/pub/backup# pg_dump -Fc -U postgres -f wildfire.backup wildfire
>
>pg_dump: Dumping the contents of table "ofrrds" failed: PQgetResult() failed.
>pg_dump: Error message from server: ERROR: out of memory
>DETAIL: Failed on request of size 1073741823.
                                   ^^^^^^^^^^

The server is trying to allocate 1GB in a single request. Obviously it
can't if 1GB is all the VM has.
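
If you need a dump to finish before you can add memory, one stopgap
(just a sketch, not something I've tried on your box) is to skip that
table for now and dump it on its own later:

  # everything except the problem table
  pg_dump -Fc -U postgres -T ofrrds -f wildfire_rest.backup wildfire

  # just ofrrds, once the VM has more memory to work with
  pg_dump -Fc -U postgres -t ofrrds -f ofrrds.backup wildfire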

>pg_dump: The command was: COPY public.ofrrds (id, updateddate, bytes) TO
>stdout;
>
>wildfire=# \dt+ ofrrds
>                   List of relations
> Schema |  Name  | Type  | Owner | Size  | Description
>--------+--------+-------+-------+-------+-------------
> public | ofrrds | table | andy  | 15 MB |
>
>
>wildfire=# \d ofrrds
>              Table "public.ofrrds"
>   Column    |          Type          | Modifiers
>-------------+------------------------+-----------
> id          | character varying(100) | not null
> updateddate | bigint                 | not null
> bytes       | bytea                  |
>Indexes:
>    "ofrrds_pk" PRIMARY KEY, btree (id)
>

There must be a heck of a lot of data in that bytea column.
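
You can check how big the stored values really are with a quick query --
a diagnostic sketch, run against your wildfire database:

  # largest values in ofrrds; octet_length() reports the uncompressed
  # size of each bytea in bytes
  psql -U postgres -d wildfire -c "
    SELECT id, octet_length(bytes) AS bytes_len
    FROM ofrrds
    ORDER BY octet_length(bytes) DESC
    LIMIT 5;"

If a single row comes back near a gigabyte, that's almost certainly
what the server is choking on.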

>I'm not sure how to get this backup to run. Any hints would be appreciated.

As Adrian mentioned already, you're going to have to give it more
memory somehow -- either more RAM or a big swap file.
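
If growing the VM's RAM isn't convenient, a swap file is quick to add.
A rough sketch, assuming the VM runs Linux (adjust the size and path to
taste):

  # create and enable a 2 GB swap file (run as root)
  fallocate -l 2G /swapfile
  chmod 600 /swapfile
  mkswap /swapfile
  swapon /swapfile

  # keep it across reboots
  echo '/swapfile none swap sw 0 0' >> /etc/fstab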

George
