any solution for doing a data file import spawning it on multiple processes

From: "hb(at)101-factory(dot)eu" <hb(at)101-factory(dot)eu>
To: postgres general support <pgsql-general(at)postgresql(dot)org>
Subject: any solution for doing a data file import spawning it on multiple processes
Date: 2012-06-16 15:04:42
Message-ID: 8AEB78AA-CF8E-4F0F-AECF-0333451B6C3A@101-factory.eu
Lists: pgsql-general

hi there,

I am trying to import large data files into PostgreSQL.
For now I use the xargs Linux command to feed the file to worker processes line by line, using the maximum number of available connections.

We use pgpool as the connection pool to the database, and so try to maximize the concurrent import of the file.
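A minimal sketch of this split-and-spawn pattern: split the file into whole-line chunks first, then let xargs run a fixed number of workers in parallel. All file and table names here are illustrative, and `wc -l` stands in for the real per-chunk load command (which would be something like `psql -c "\copy mytable FROM 'chunk' CSV"`):

```shell
# Create a sample input file (stand-in for the real data file).
seq 1 1000 > data.txt

# Split into fixed-size chunks so each worker gets whole lines,
# instead of spawning one process per line.
split -l 100 data.txt chunk_

# Run up to 4 workers in parallel; in the real import each worker
# would invoke psql \copy on its chunk instead of wc -l.
ls chunk_* | xargs -n 1 -P 4 -I {} sh -c 'wc -l < {}' > counts.txt

# Verify no lines were lost: the chunk counts must sum to the input size.
total=$(awk '{s+=$1} END {print s}' counts.txt)
echo "loaded $total lines"
```

Because xargs waits for its children, this also avoids leaving zombie processes behind, and the final count check makes a dropped line detectable.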

The problem: it mostly seems to work well, but we miss a line once in a while, and that is not acceptable. It also creates zombie processes ;(.

Does anybody have any other tricks that will do the job?

thanks,

Henk
