I have a PostGIS table with about 2 million polyline records. The largest
geometry in the table has about 500 points. I have a simple DBD::Pg Perl
program that selects most of these records and does some processing on
them before writing them to a file.
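Roughly, the relevant part of the program looks like this (simplified; the connection details, table, and column names here are placeholders, not my real schema):

```perl
use strict;
use warnings;
use DBI;

# Connection parameters are placeholders
my $dbh = DBI->connect('dbi:Pg:dbname=gisdb', 'user', 'pass',
                       { RaiseError => 1, AutoCommit => 1 });

# Select most of the ~2 million polylines in a single statement
my $sth = $dbh->prepare(
    'SELECT id, ST_AsText(geom) FROM polylines'
);
$sth->execute();    # this appears to be where it dies

open my $out, '>', 'polylines.txt' or die "open: $!";
while ( my ($id, $wkt) = $sth->fetchrow_array ) {
    # ... processing elided ...
    print {$out} "$id\t$wkt\n";
}
close $out;
$dbh->disconnect;
```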
Unfortunately, I seem to keep getting this error:
    DBD::Pg::st execute failed: out of memory for query result
    DBD::Pg::st fetchrow_array failed: no statement executing
The program works fine on my next-largest table, which has just under a
million records. I believe the failure occurs at the first execute()
after the prepare() on this table.
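One thing I've been wondering about is whether a server-side cursor would help, so that rows come back in batches instead of the whole result set being buffered on the client. Something like this sketch, which I haven't tested (cursor name, batch size, and query are illustrative):

```perl
use strict;
use warnings;
use DBI;

# Connection parameters are placeholders; AutoCommit off so the
# cursor lives inside a transaction, as PostgreSQL requires.
my $dbh = DBI->connect('dbi:Pg:dbname=gisdb', 'user', 'pass',
                       { RaiseError => 1, AutoCommit => 0 });

# Declare a server-side cursor over the big query
$dbh->do('DECLARE poly_cur CURSOR FOR '
       . 'SELECT id, ST_AsText(geom) FROM polylines');

# Pull rows in batches of 10,000 instead of all at once
my $sth = $dbh->prepare('FETCH 10000 FROM poly_cur');
while (1) {
    $sth->execute();
    last if $sth->rows == 0;    # cursor exhausted
    while ( my ($id, $wkt) = $sth->fetchrow_array ) {
        # ... process each polyline ...
    }
}

$dbh->do('CLOSE poly_cur');
$dbh->commit;
$dbh->disconnect;
```

Is something along these lines the right way to go, or is there a better option?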
Any ideas on how I can do this?
Thanks,
Tim