Re: psql vs perl prepared inserts

From: Sean Davis <sdavis2(at)mail(dot)nih(dot)gov>
To: Dawid Kuroczko <qnex42(at)gmail(dot)com>
Cc: pgsql-general(at)postgresql(dot)org
Subject: Re: psql vs perl prepared inserts
Date: 2005-04-13 10:11:32
Message-ID: 5f32a4ab68b004b6e6346949123930b3@mail.nih.gov
Lists: pgsql-general


On Apr 13, 2005, at 4:12 AM, Dawid Kuroczko wrote:

> On 4/12/05, Matt Van Mater <matt(dot)vanmater(at)gmail(dot)com> wrote:
>> I've been experimenting with loading a large amount of data into a
>> fairly simple database using both psql and perl prepared statements.
>> Unfortunately I'm seeing no appreciable differences between the two
>> methods, where I was under the impression that prepared statements
>> should be much faster (in my case, they are slightly slower).
>
> I've been playing with a similar issue, and in my case the best solution
> for bulk inserts was using perl to format the data in a form suitable for
> the COPY command.
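
A minimal sketch of that approach using DBD::Pg's COPY interface (`pg_putcopydata`/`pg_putcopyend`). The table name, columns, and connection parameters here are made up for illustration; COPY's text format expects tab-separated columns, with backslash escapes and `\N` for NULL:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use DBI;

# Hypothetical connection; adjust dbname/user for your setup.
my $dbh = DBI->connect('dbi:Pg:dbname=test', 'user', '',
                       { RaiseError => 1, AutoCommit => 0 });

# Escape backslash, tab, and newline as COPY's text format requires.
sub copy_escape {
    my $v = shift;
    return '\N' unless defined $v;    # COPY's NULL marker
    $v =~ s/\\/\\\\/g;
    $v =~ s/\t/\\t/g;
    $v =~ s/\n/\\n/g;
    return $v;
}

$dbh->do('COPY events (id, name) FROM STDIN');
for my $row ([1, 'alpha'], [2, "has\ttab"], [3, undef]) {
    $dbh->pg_putcopydata(join("\t", map { copy_escape($_) } @$row) . "\n");
}
$dbh->pg_putcopyend();
$dbh->commit;
```

The win over prepared INSERTs is that the whole stream goes over in one command, so the per-row round trip and per-statement overhead disappear.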

I second this approach. Generally, getting the data into the database
can be done VERY quickly (for the 18k rows you have, it would likely be
instantaneous to copy them). I often create a separate "loader" schema
into which I load text files. Then, I can use SQL, triggers, or
functions to "clean up" the data, enforce referential integrity, etc.
within the database. If you have perl code to do this, you can
probably modify it slightly to run as a pl/perl function; it does the
same thing as before, but now on the server side, which will probably
be significantly faster.
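
The loader-schema pattern above might look like the following in psql. Everything here (schema, table, and column names, the cast and trim cleanup) is a hypothetical sketch, not anyone's actual schema; `\copy` is psql's client-side wrapper around COPY:

```sql
CREATE SCHEMA loader;

-- A permissive staging table: everything comes in as text.
CREATE TABLE loader.raw_events (id text, name text, created text);

-- Bulk-load the file in one shot.
\copy loader.raw_events FROM 'events.txt'

-- Clean up and enforce types/constraints while moving rows
-- into the real table, all inside the database.
INSERT INTO public.events (id, name, created)
SELECT id::integer, trim(name), created::timestamp
FROM loader.raw_events
WHERE id ~ '^[0-9]+$';
```

Because the cleanup runs as set-oriented SQL on data already in the server, it avoids the row-at-a-time client round trips that made the prepared-statement approach slow.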

Sean
