Re: parallel dump fails to dump large tables

From: Raymond O'Donnell <rod(at)iol(dot)ie>
To: Shanker Singh <ssingh(at)iii(dot)com>, "pgsql-general(at)postgresql(dot)org" <pgsql-general(at)postgresql(dot)org>
Subject: Re: parallel dump fails to dump large tables
Date: 2015-02-14 16:48:33
Message-ID: 54DF7C61.8060205@iol.ie
Lists: pgsql-general

On 14/02/2015 15:42, Shanker Singh wrote:
> Hi,
> I am having a problem using the parallel pg_dump feature in postgres release
> 9.4. The size of the table is large (54GB). The dump fails with the
> error: "pg_dump: [parallel archiver] a worker process died
> unexpectedly". After this error the pg_dump aborts. The error log file
> gets the following message:
>
> 2015-02-09 15:22:04 PST [8636]: [2-1] user=pdroot,db=iii,appname=pg_dump
> STATEMENT: COPY iiirecord.varfield (id, field_type_tag, marc_tag,
> marc_ind1, marc_ind2, field_content, field_group_id, occ_num, record_id)
> TO stdout;
> 2015-02-09 15:22:04 PST [8636]: [3-1] user=pdroot,db=iii,appname=pg_dump
> FATAL: connection to client lost

There's your problem - something went wrong with the network.
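
For reference, a minimal sketch of the kind of invocation being described
(parallel dumps require the directory output format; the worker count and
output path below are assumptions, since the original post doesn't show the
exact command):

    # -j 4 and the output path are illustrative; user/db taken from the log above
    pg_dump -Fd -j 4 -f /path/to/dumpdir -U pdroot iii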

Ray.

--
Raymond O'Donnell :: Galway :: Ireland
rod(at)iol(dot)ie
