Re: large numbers of inserts out of memory strategy

From: Ted Toth <txtoth(at)gmail(dot)com>
To: Tomas Vondra <tomas(dot)vondra(at)2ndquadrant(dot)com>
Cc: pgsql-general <pgsql-general(at)postgresql(dot)org>
Subject: Re: large numbers of inserts out of memory strategy
Date: 2017-11-28 18:26:20
Message-ID: CAFPpqQGJ9GkntRrjEqvHCRtkbHojmTy93LmCpfuMGJj5F2O7-A@mail.gmail.com
Lists: pgsql-general

On Tue, Nov 28, 2017 at 12:01 PM, Tomas Vondra
<tomas(dot)vondra(at)2ndquadrant(dot)com> wrote:
>
>
> On 11/28/2017 06:54 PM, Ted Toth wrote:
>> On Tue, Nov 28, 2017 at 11:22 AM, Tomas Vondra
>> <tomas(dot)vondra(at)2ndquadrant(dot)com> wrote:
>>> Hi,
>>>
>>> On 11/28/2017 06:17 PM, Ted Toth wrote:
>>>> I'm writing a migration utility to move data from non-rdbms data
>>>> source to a postgres db. Currently I'm generating SQL INSERT
>>>> statements involving 6 related tables for each 'thing'. With 100k or
>>>> more 'things' to migrate I'm generating a lot of statements and when I
>>>> try to import using psql postgres fails with 'out of memory' when
>>>> running on a Linux VM with 4G of memory. If I break it into smaller
>>>> chunks of, say, ~50k statements, the import succeeds. I can change my
>>>> migration utility to generate multiple files each with a limited
>>>> number of INSERTs to get around this issue but maybe there's
>>>> another/better way?
>>>>
>>>
>>> The question is what exactly runs out of memory, and how did you modify
>>> the configuration (particularly related to memory).
>>>
>>> regards
>>>
>>> --
>>> Tomas Vondra http://www.2ndQuadrant.com
>>> PostgreSQL Development, 24x7 Support, Remote DBA, Training & Services
>>
>> I'm pretty new to postgres, so I haven't changed any configuration
>> settings, and the log is a bit hard for me to make sense of :(
>>
>
> The most interesting part of the log is this:
>
> SPI Proc: 2464408024 total in 279 blocks; 1672 free (1 chunks);
> 2464406352 used
> PL/pgSQL function context: 537911352 total in 74 blocks; 2387536
> free (4 chunks); 535523816 used
>
>
> That is, most of the memory is allocated for SPI (2.4GB) and PL/pgSQL
> procedure (500MB). How do you do the load? What libraries/drivers?
>
> regards
>
> --
> Tomas Vondra http://www.2ndQuadrant.com
> PostgreSQL Development, 24x7 Support, Remote DBA, Training & Services

I'm doing the load with 'psql -f'. I'm using the 9.6 el6 RPMs on a CentOS
VM, downloaded from the postgres repo.
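For what it's worth, the chunked-load workaround described above can be done without changing the migration utility, by splitting the generated SQL file and running one psql invocation per chunk. A minimal sketch (file names, the sample data, and the chunk size are assumptions for illustration; in practice the chunk size would be ~50k statements, one statement per line):

```shell
# Generate a small sample file standing in for the migration output:
printf 'INSERT INTO t VALUES (%s);\n' $(seq 1 100) > inserts.sql

# Split it into fixed-size chunks (40 lines here; ~50000 in practice).
# -d gives numeric suffixes: chunk_00, chunk_01, ...
split -l 40 -d inserts.sql chunk_

# Load each chunk in its own psql invocation, so the server-side memory
# for one batch is released before the next begins.  --single-transaction
# keeps each chunk atomic; drop the leading 'echo' to actually run it.
for f in chunk_*; do
    echo psql --single-transaction -f "$f" mydb
done
```

This only works cleanly if each statement is self-contained; statements that span lines would need the split done at statement boundaries instead.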
