From: André Volpato <andre(dot)volpato(at)ecomtecnologia(dot)com(dot)br>
To: Mike Ginsburg <mginsburg(at)collaborativefusion(dot)com>
Cc: pgsql-general(at)postgresql(dot)org
Subject: Re: Large Result and Memory Limit
Date: 2007-10-04 20:12:27
Message-ID: 4705492B.7030807@ecomtecnologia.com.br
Lists: pgsql-general
Mike Ginsburg wrote:
> André Volpato wrote:
>> Mike Ginsburg wrote:
>>> Hello,
>>> I am working on a personnel registry that has upwards of 50,000
>>> registrants. Currently I am working on an export module that will
>>> create a CSV from multiple tables. I have managed to keep the
>>> script (PHP) under the memory limit
>> okay... some info needed here.
>> 1. memory on the DB server
>> 2. memory_limit in php.ini
> PHP's memory_limit is 16M. We're running multiple installations on a
> single webserver, so memory is a concern.
> The DB server is separate from the webserver.
>>> when creating and inserting the CSV into the database. The problem
>>> comes when I try to query for the data and export it. Memory limit
>>> is a major concern, but the query for one row returns a result set
>>> too large and PHP fails.
>> a single row is enough to crash PHP?
> Well, the "data" field in the table (a text column) contains 50K lines.
> It's over 30M in size for the full export.
>>
>>>
>>> I've thought about storing the data in multiple rows and then
>>> querying one-by-one and outputting, but was hoping there was a
>>> better way.
>> if you can't raise memory_limit, I think it's the only way.
> I was afraid that would be the answer.
Well, you could try to retrieve the data in pieces with substr(), say 10k
lines per query across 5 queries.
It's kinda ugly, but it should work in this case...
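
Something along these lines might do it (an untested sketch; the table name
"exports", the column "data", and the id value 42 are just placeholders for
whatever your export table really looks like):

    <?php
    // Sketch: pull the big text column in fixed-size chunks with substr()
    // and stream each piece, so PHP never holds the whole 30M at once.
    $conn = pg_connect('host=dbserver dbname=registry user=export');

    $chunk  = 1000000;   // ~1MB per round trip, well under a 16M memory_limit
    $offset = 1;         // substr() in PostgreSQL is 1-based

    while (true) {
        $res = pg_query_params($conn,
            'SELECT substr(data, $1, $2) FROM exports WHERE id = $3',
            array($offset, $chunk, 42));
        $piece = pg_fetch_result($res, 0, 0);
        if ($piece === false || $piece === null || $piece === '') {
            break;       // ran past the end of the column
        }
        echo $piece;     // write straight to output instead of buffering
        $offset += $chunk;
    }

    pg_close($conn);
    ?>

Since each piece goes straight to the output, the CSV reassembles on the
client side; just don't collect the pieces in a PHP variable or you're back
where you started.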
[]'s
ACV