Re: large numbers of inserts out of memory strategy

From: Steven Lembark <lembark(at)wrkhors(dot)com>
To: pgsql-general(at)lists(dot)postgresql(dot)org
Cc: lembark(at)wrkhors(dot)com
Subject: Re: large numbers of inserts out of memory strategy
Date: 2017-11-29 17:02:10
Message-ID: 20171129110210.59a57d4d@wrkhors.com
Lists: pgsql-general


> > what tools / languages are you using?
>
> I'm using Python to read binary source files and create the text files
> containing the SQL. Then I'm running psql -f <file containing SQL>.

Then chunking the input should be trivial.
There are a variety of techniques you can use, such as disabling
indexes during loading. Another option is to load each chunk into a
temp table and then insert the temp table's rows into the destination
table (a sketch follows below). The point is to amortize the memory
load over the entire load period.
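
For instance, here is a minimal sketch of the chunked, temp-table
approach, assuming the binary reader yields row tuples; the table
name "destination" and the chunk size are illustrative placeholders,
not from the original thread:

    CHUNK_SIZE = 10000  # rows per generated SQL file; tune to memory

    def emit_chunk(rows, path):
        # Each file is one transaction: stage the rows in a temp
        # table, then move them to the real table in a single
        # INSERT ... SELECT.
        with open(path, "w") as f:
            f.write("BEGIN;\n")
            f.write("CREATE TEMP TABLE stage (LIKE destination)"
                    " ON COMMIT DROP;\n")
            for vals in rows:
                # real code should quote/escape values properly
                f.write("INSERT INTO stage VALUES (%s);\n"
                        % ", ".join(map(str, vals)))
            f.write("INSERT INTO destination SELECT * FROM stage;\n")
            f.write("COMMIT;\n")

    def write_chunked_sql(rows, prefix="load_chunk"):
        chunk, n = [], 0
        for row in rows:
            chunk.append(row)
            if len(chunk) == CHUNK_SIZE:
                emit_chunk(chunk, "%s_%04d.sql" % (prefix, n))
                chunk, n = [], n + 1
        if chunk:
            emit_chunk(chunk, "%s_%04d.sql" % (prefix, n))

Each generated file then goes through the same psql -f step as
before, so only one chunk's worth of SQL is ever in play at a time.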

--
Steven Lembark 1505 National Ave
Workhorse Computing Rockford, IL 61103
lembark(at)wrkhors(dot)com +1 888 359 3508
