From: Charles Tassell <ctassell(at)isn(dot)net>
To: pgsql-general(at)postgreSQL(dot)org
Subject: Re: [GENERAL] BIG Data and Perl
Date: 1999-10-18 03:31:21
Message-ID: 4.1.19991018002646.00992850@mailer.isn.net
Lists: pgsql-general
This is slightly unrelated (well, maybe more than slightly), but what is the
advantage of using cursors over normal SELECT statements? I know from
experience that just using an execute("SELECT...") and fetchrow_array
doesn't go wild with memory usage, as long as you remember to close your
statement handles.
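
For what it's worth, the usual answer is that with a cursor the backend holds the full result set and the client pulls it down in batches, so memory stays bounded no matter how big the table is. A minimal sketch of that in Perl with DBI/DBD::Pg might look like the following — the table name `bigtable`, cursor name `bigcur`, and the connection parameters are all placeholders, and this obviously needs a live PostgreSQL database to run:

```perl
use strict;
use DBI;

# Placeholder connection parameters -- substitute your own.
my $dbh = DBI->connect("dbi:Pg:dbname=test", "user", "pass",
                       { AutoCommit => 0, RaiseError => 1 });

# Cursors only live inside a transaction, hence AutoCommit => 0.
# The backend keeps the result set; the client never holds all
# the rows at once.
$dbh->do("DECLARE bigcur CURSOR FOR SELECT * FROM bigtable");

while (1) {
    my $sth = $dbh->prepare("FETCH 100 FROM bigcur");
    $sth->execute;
    last if $sth->rows == 0;           # no rows left
    while (my @row = $sth->fetchrow_array) {
        # process one row at a time here
    }
    $sth->finish;                      # close the statement handle
}

$dbh->do("CLOSE bigcur");
$dbh->commit;
$dbh->disconnect;
```

Whether this actually saves memory over a plain execute/fetchrow_array loop depends on whether the driver fetches the whole result set up front, which seems to be what Lincoln is describing below.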
At 11:48 PM 10/17/99, Lincoln Yeoh wrote:
>At 09:52 AM 15-10-1999 -0500, Andy Lewis wrote:
>>I've got a fairly good size database that has in one table around 50,000
>>records in it.
>>
>>It starts of and processes the first 300-400 rows fast and then gets
>>slower in time and eventually just quits. It'll run for about 4-6 hours
>>before it quits.
>>
>>Any idea what may be going on here?
>
>Maybe you're running out of memory. Your perl script may be reading too
>much into memory.
>
>When using the perl DBI module, I get the impression that the perl script
>reads in all the results when you do
>$cursor->execute
>
>I don't know if there are any ways around this. It can be a bit
>inconvenient if the result is large ;).
>
>Cheerio,
>
>Link.
Next Message: Carsten Huettl | 1999-10-18 04:19:58 | user 'postgres' is not in 'pg_shadow'
Previous Message: Lincoln Yeoh | 1999-10-18 02:48:01 | Re: [GENERAL] BIG Data and Perl