From: Mike Ginsburg <mginsburg(at)collaborativefusion(dot)com>
To: André Volpato <andre(dot)volpato(at)ecomtecnologia(dot)com(dot)br>
Cc: pgsql-general(at)postgresql(dot)org
Subject: Re: Large Result and Memory Limit
Date: 2007-10-04 19:57:42
Message-ID: 470545B6.5050301@collaborativefusion.com
Lists: pgsql-general
André Volpato wrote:
> Mike Ginsburg wrote:
>> Hello,
>> I am working on a personnel registry that has upwards of 50,000
>> registrants. Currently I am building an export module that will
>> create a CSV from multiple tables. I have managed to keep the script
>> (PHP) under the memory limit
> okay... some info needed here:
> 1. memory on the DB server
> 2. memory_limit in php.ini
The PHP memory_limit is 16M. We're running multiple installations on a
single webserver, so memory is a concern. The DB server is separate
from the webserver.
>> when creating and inserting the CSV into the database. The problem
>> comes when I try to query for the data and export it. Memory limit
>> is a major concern, but the query for even a single row returns a
>> result set too large for PHP to hold, and the script fails.
> a single row is enough to crash PHP?
Well the "data" field in the table (text) contains 50K lines. It's over
30M in size for the full export.
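
One idea I'm considering: since the whole value only has to pass
through PHP on its way to the client, I could pull it out in slices
with substr() on the server side. A rough sketch (the "exports" table,
"data" column, id, and connection details are all made up):

<?php
// Pull a ~30M text column out of Postgres in 1MB slices so the
// script stays well under a 16M memory_limit.
$db = pg_connect('host=dbhost dbname=registry user=www');
$chunk  = 1048576; // 1MB per round trip
$offset = 1;       // substr() in Postgres is 1-based

do {
    $res = pg_query_params($db,
        'SELECT substr(data, $1, $2) FROM exports WHERE id = $3',
        array($offset, $chunk, 42));
    $piece = pg_fetch_result($res, 0, 0);
    echo $piece;           // stream the slice straight to the client
    $offset += $chunk;
} while (strlen($piece) == $chunk);
?>

Each round trip holds at most one slice, so the full 30M string never
has to fit in PHP's memory at once.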
>
>>
>> I've thought about storing the data in multiple rows and then
>> querying one-by-one and outputting, but was hoping there was a better
>> way.
> if you can't raise memory_limit, I think it's the only way.
I was afraid that would be the answer.
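
If I do go the multiple-rows route, my plan is roughly the sketch
below: write the CSV out as ordered chunk rows, then walk them with a
cursor so only one chunk is in memory at a time. The
export_chunks(export_id, seq, data) table is hypothetical:

<?php
// Emit the CSV chunk-by-chunk from a hypothetical export_chunks
// table; the cursor keeps the row-at-a-time state on the server.
$db = pg_connect('host=dbhost dbname=registry user=www');
pg_query($db, 'BEGIN'); // cursors only live inside a transaction
pg_query($db, 'DECLARE csv CURSOR FOR
                 SELECT data FROM export_chunks
                 WHERE export_id = 42 ORDER BY seq');
while (true) {
    $res = pg_query($db, 'FETCH 1 FROM csv');
    if (pg_num_rows($res) == 0) {
        break;             // no more chunks
    }
    echo pg_fetch_result($res, 0, 0);
}
pg_query($db, 'CLOSE csv');
pg_query($db, 'COMMIT');
?>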
>
> []'s
> ACV
Mike Ginsburg
Collaborative Fusion, Inc.
mginsburg(at)collaborativefusion(dot)com
412-422-3463 x4015