Hi,
I'm building an application (microservice architecture using Docker containers) and I need to save a large amount of data into multiple tables at the same time.
I have a table provider with the columns: name, status, address, contact.
The table establishment contains: name, status, address, provider_id (foreign key referencing the provider table).
The table product contains: name, type, ..., establishment_id (foreign key referencing the establishment table).
I have about 30000 objects to insert, and I'm saving them in batches of 1000 objects at a time.
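To illustrate, here is roughly what my batching loop does (a simplified Python sketch; the names, psycopg2, and execute_values are just examples to show the idea, not my exact code):

```python
from itertools import islice

def batched(iterable, size):
    """Yield successive lists of at most `size` items from `iterable`."""
    it = iter(iterable)
    while True:
        chunk = list(islice(it, size))
        if not chunk:
            return
        yield chunk

# ~30000 objects, saved 1000 at a time -> 30 batches
rows = [(f"product-{i}", "some-type") for i in range(30000)]

for batch in batched(rows, 1000):
    # Each batch is sent to PostgreSQL in one statement, e.g. with
    # psycopg2.extras.execute_values(cur,
    #     "INSERT INTO product (name, type) VALUES %s", batch)
    # which sends the whole batch in a single round trip.
    pass
```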
The whole insertion takes about 10 minutes, so I'm facing a performance problem here. Does saving a large amount of data in PostgreSQL normally consume a lot of resources in the background? Or is this due to a bad design of the application?