Bad performance when inserting a lot of data simultaneously

From: hmidi slim <hmidi(dot)slim2(at)gmail(dot)com>
To: pgsql-general(at)postgresql(dot)org
Subject: Bad performance when inserting a lot of data simultaneously
Date: 2018-01-17 20:27:14
Message-ID: CAMsqVxtO8dtoa41ytDR38EYvcK0MocUkmtYmY-cd_5vjihb7kw@mail.gmail.com
Lists: pgsql-general

Hi,
I'm creating an application (microservice architecture using Docker
containers) and I need to save a large amount of data in multiple tables at
the same time.
I have a table provider with the columns: name, status, address, contact.
The table establishment contains: name, status, address, provider_id
(foreign key referencing the table provider).
The table product contains: name, type, ..., establishment_id (foreign key
referencing the table establishment).
I have about 30,000 objects to insert and I'm trying to save them in
batches (1,000 objects at a time).
The insertion process takes about 10 minutes, so I'm facing a performance
problem here. I want to know: does saving a large amount of data in
Postgres take a lot of resources in the background? Or is this due to a
bad design of the app?
