Re: large numbers of inserts out of memory strategy

From: Ted Toth <txtoth(at)gmail(dot)com>
To: Tomas Vondra <tomas(dot)vondra(at)2ndquadrant(dot)com>
Cc: pgsql-general <pgsql-general(at)postgresql(dot)org>
Subject: Re: large numbers of inserts out of memory strategy
Date: 2017-11-28 17:54:11
Message-ID: CAFPpqQHY39wDEQm7JeBD6JSN_UgNi-19kdGdZKmhQ2gxTpw8Qw@mail.gmail.com
Lists: pgsql-general

On Tue, Nov 28, 2017 at 11:22 AM, Tomas Vondra
<tomas(dot)vondra(at)2ndquadrant(dot)com> wrote:
> Hi,
>
> On 11/28/2017 06:17 PM, Ted Toth wrote:
>> I'm writing a migration utility to move data from non-rdbms data
>> source to a postgres db. Currently I'm generating SQL INSERT
>> statements involving 6 related tables for each 'thing'. With 100k or
>> more 'things' to migrate I'm generating a lot of statements and when I
>> try to import using psql, postgres fails with 'out of memory' when
>> running on a Linux VM with 4G of memory. If I break it into smaller
>> chunks, say ~50K statements, then the import succeeds. I can change my
>> migration utility to generate multiple files each with a limited
>> number of INSERTs to get around this issue but maybe there's
>> another/better way?
>>
>
> The question is what exactly runs out of memory, and how did you modify
> the configuration (particularly related to memory).
>
> regards
>
> --
> Tomas Vondra http://www.2ndQuadrant.com
> PostgreSQL Development, 24x7 Support, Remote DBA, Training & Services

I'm pretty new to postgres, so I haven't changed any configuration
settings, and the log is a bit hard for me to make sense of :(
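The chunking workaround described above (splitting the generated SQL into files of roughly 50K statements each) can be sketched as follows. This is an illustrative sketch, not the original migration utility: the names `make_inserts`, `write_chunks`, and `CHUNK_SIZE` are invented, and the per-"thing" statement generation for the six related tables is stubbed out.

```python
# Sketch: instead of one huge SQL file, emit multiple files of at most
# CHUNK_SIZE statements, each wrapped in its own transaction, so psql
# never has to absorb the whole load at once.

CHUNK_SIZE = 50_000  # ~50K statements per file imported fine per the report


def make_inserts(thing_id):
    # Stand-in for the real generation of INSERTs across 6 related tables.
    return [f"INSERT INTO t{i} (thing_id) VALUES ({thing_id});" for i in range(6)]


def write_chunks(thing_ids, chunk_size=CHUNK_SIZE):
    """Yield (filename, sql_text) pairs, each holding at most chunk_size
    INSERT statements inside a single BEGIN/COMMIT block."""
    buf, file_no = [], 0
    for tid in thing_ids:
        buf.extend(make_inserts(tid))
        if len(buf) >= chunk_size:
            yield (f"load_{file_no:04d}.sql",
                   "BEGIN;\n" + "\n".join(buf) + "\nCOMMIT;\n")
            buf, file_no = [], file_no + 1
    if buf:  # flush the final partial chunk
        yield (f"load_{file_no:04d}.sql",
               "BEGIN;\n" + "\n".join(buf) + "\nCOMMIT;\n")
```

Each emitted file can then be fed to psql separately (e.g. `psql -f load_0000.sql`), keeping any single run's memory footprint bounded.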

Attachment Content-Type Size
psql.outofmem.log application/octet-stream 24.7 KB
