BIG Data and Perl

From: Andy Lewis <alewis(at)roundnoon(dot)com>
To: pgsql-general(at)postgreSQL(dot)org
Subject: BIG Data and Perl
Date: 1999-10-15 14:52:11
Message-ID: Pine.LNX.4.05.9910150935320.22435-100000@rns.roundnoon.com
Lists: pgsql-general

I've got a fairly good-sized database; one of its tables holds around 50,000
records.

I'm accessing the DB with a perl script that is located on **another**
machine on the same network.

Once a week I have to do a "SELECT * ...." from one of the tables, fetch the
data, open another file from disk, read in some of that data, and finally
write it all together back to disk in small files that can be emailed out.

The query looks like: SELECT * from mytable order by member_id

-- cut --
$result = $conn->exec($query);

$ntuples = $result->ntuples;
print STDOUT "Total: $ntuples \n\n";

while ( @row = $result->fetchrow ) {
    # do some stuff here... i.e., open a file and read from it
}
-- cut --
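
For context, the whole weekly job amounts to something like the sketch below.
The connection string, the column order, and the file paths are placeholders
I've made up for illustration, not what the real script does:

-- cut --
use Pg;

# Placeholder connection string -- the real script connects to the DB
# on another machine over the network.
my $conn = Pg::connectdb("dbname=mydb host=dbhost");
die $conn->errorMessage unless $conn->status == PGRES_CONNECTION_OK;

my $result = $conn->exec("SELECT * FROM mytable ORDER BY member_id");
die $conn->errorMessage unless $result->resultStatus == PGRES_TUPLES_OK;

print STDOUT "Total: ", $result->ntuples, "\n\n";

while ( my @row = $result->fetchrow ) {
    my $member_id = $row[0];    # assuming member_id is the first column

    # Read some extra per-member data from disk (made-up path).
    open(my $in, "<", "data/$member_id.dat") or next;
    my @extra = <$in>;
    close($in);

    # Write it all together back out as a small file that can be emailed.
    open(my $out, ">", "out/$member_id.txt") or die "out/$member_id.txt: $!";
    print $out join("\t", @row), "\n", @extra;
    close($out);
}
-- cut --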

Here's the strange part, and it could very well be caused by another part of
this script (I've inherited it).

It starts off processing the first 300-400 rows fast, then gets slower and
slower over time and eventually just quits. It'll run for about 4-6 hours
before quitting.

Any idea what may be going on here?

Thanks

Andy
