From: Morten Sickel <Morten(dot)Sickel(at)nrpa(dot)no>
To: 'Jodi Kanter' <jkanter(at)virginia(dot)edu>, Postgres Admin List <pgsql-admin(at)postgresql(dot)org>
Subject: Re: slow inserts
Date: 2002-03-25 09:23:10
Message-ID: 54DE9A561AD20C4D9FF88B116965420E029F9C@postix.nrpa.no
Lists: pgsql-admin
Jodi Kanter wrote:
> I am currently using a Perl data loader that was set up to load data to
> three particular tables.
(Snip)
> I have placed some debugging syntax in the code and it seems that the
> extra time is related to postgres, as I had originally thought it may
> have to do with the parsing of the Excel file.
You don't mention it, but I assume you are using DBI/Pg. Are you sure you
are setting up your insert query handles once and then reusing them?
Setting up a query handle takes a lot of time.
i.e.

my $dbh = DBI->connect("dbi:Pg:...");
my $insh = $dbh->prepare("INSERT INTO table VALUES (?, ?, ?)");
foreach my $excelrow (@rows) {
    # parse the row into $data1, $data2, $data3
    $insh->execute($data1, $data2, $data3);
}
I have written a few scripts of that kind myself, and I was really surprised
how much it mattered when I managed to move a $dbh->prepare out of the
insert loop.
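The same prepare-once, execute-many idea carries over to other drivers. A
minimal sketch in Python using the standard-library sqlite3 module (purely
illustrative, not the original Perl/DBI code; table and column names are made
up):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE samples (a INTEGER, b INTEGER, c INTEGER)")

rows = [(1, 2, 3), (4, 5, 6), (7, 8, 9)]

# One parameterized statement reused for every row: the SQL is parsed
# once, rather than being re-parsed for each individual insert.
conn.executemany("INSERT INTO samples VALUES (?, ?, ?)", rows)
conn.commit()

count = conn.execute("SELECT count(*) FROM samples").fetchone()[0]
print(count)  # 3
```

The slow pattern this thread warns against is the equivalent of building a
fresh statement (or a fresh string of literal SQL) inside the loop, once per
row.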
regards
--
Morten Sickel
Norwegian Radiation Protection Authority