Re: large numbers of inserts out of memory strategy

From: Tomas Vondra <tomas(dot)vondra(at)2ndquadrant(dot)com>
To: Ted Toth <txtoth(at)gmail(dot)com>, pgsql-general <pgsql-general(at)postgresql(dot)org>
Subject: Re: large numbers of inserts out of memory strategy
Date: 2017-11-28 17:22:17
Message-ID: ad918f32-4913-3dee-281a-5a3fee576a14@2ndquadrant.com
Lists: pgsql-general

Hi,

On 11/28/2017 06:17 PM, Ted Toth wrote:
> I'm writing a migration utility to move data from a non-RDBMS data
> source to a postgres db. Currently I'm generating SQL INSERT
> statements involving 6 related tables for each 'thing'. With 100k or
> more 'things' to migrate I'm generating a lot of statements, and when
> I try to import them using psql, postgres fails with 'out of memory'
> when running on a Linux VM with 4G of memory. If I break the import
> into smaller chunks, say ~50k statements each, it succeeds. I can
> change my migration utility to generate multiple files, each with a
> limited number of INSERTs, to get around this issue, but maybe
> there's another/better way?
>

The question is what exactly runs out of memory (the psql client, or a
server backend?), and how you've modified the configuration
(particularly the memory-related settings).
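
For reference, the memory-related settings are easy to report straight
from psql; something like this (the list below is just the usual
starting set of GUCs, not anything specific to this failure):

    SELECT name, setting, unit
      FROM pg_settings
     WHERE name IN ('shared_buffers', 'work_mem',
                    'maintenance_work_mem', 'max_connections');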
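
And independent of the exact cause, the usual way to keep a bulk load
flat on memory is to commit in chunks and to use multi-row INSERTs or
COPY rather than one INSERT per row. A minimal sketch, assuming a
hypothetical table things(id, name), since the actual schema isn't
shown:

    -- one transaction per chunk, a few thousand rows per statement
    BEGIN;
    INSERT INTO things (id, name) VALUES
        (1, 'first'),
        (2, 'second');  -- ... more rows ...
    COMMIT;

    -- or stream the data client-side with psql's \copy:
    \copy things (id, name) FROM 'things.csv' WITH (FORMAT csv)

COPY streams the rows instead of accumulating individually parsed
statements, so it tends to sidestep this class of problem entirely.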

regards

--
Tomas Vondra http://www.2ndQuadrant.com
PostgreSQL Development, 24x7 Support, Remote DBA, Training & Services
