From: André Volpato <andre(dot)volpato(at)ecomtecnologia(dot)com(dot)br>
To: Mike Ginsburg <mginsburg(at)collaborativefusion(dot)com>
Cc: pgsql-general(at)postgresql(dot)org
Subject: Re: Large Result and Memory Limit
Date: 2007-10-04 19:49:36
Message-ID: 470543D0.2090607@ecomtecnologia.com.br
Lists: pgsql-general
Mike Ginsburg wrote:
> Hello,
> I am working on a personnel registry that has upwards of 50,000
> registrants. Currently I am working on an export module that will
> create a CSV from multiple tables. I have managed to keep the script
> (PHP) under the memory limit
Okay... some info is needed here:
1. memory on the DB server
2. memory_limit in php.ini
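
For the PHP side you can check (and, if the host permits, raise) the limit from the script itself; the DB-server figure you'd have to get from the OS. An untested sketch, the 256M value is just an example:

<?php
// what the script is currently allowed to allocate
echo ini_get('memory_limit'), "\n";   // e.g. "128M"
// per-script raise, if the host allows it
ini_set('memory_limit', '256M');
?>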
> when creating and inserting the CSV into the database. The problem
> comes when I try to query for the data and export it. Memory limit is
> a major concern, but the query for one row returns a result set too
> large and PHP fails.
A single row is enough to crash PHP?
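If so, one workaround is to slice the big column server-side with substring() and stream the pieces, so PHP only ever holds one chunk. An untested sketch; the table/column names ("exports", "csv_data", id 42) are made up:

<?php
$db    = pg_connect('dbname=registry');
$chunk = 1048576;   // 1 MB per round trip
$pos   = 1;         // substring() counts from 1
do {
    $res   = pg_query_params($db,
        'SELECT substring(csv_data from $1 for $2) FROM exports WHERE id = $3',
        array($pos, $chunk, 42));
    $piece = pg_fetch_result($res, 0, 0);
    echo $piece;    // send straight to the client, keep nothing around
    $pos  += $chunk;
} while (strlen($piece) === $chunk);   // a short piece means we hit the end
?>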
>
> I've thought about storing the data in multiple rows and then querying
> one-by-one and outputting, but was hoping there was a better way.
If you can't raise memory_limit, I think it's the only way.
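
A minimal sketch of that one-by-one loop, using a server-side cursor so only a small batch is ever in PHP at a time (untested; the table name is made up):

<?php
$db = pg_connect('dbname=registry');
pg_query($db, 'BEGIN');   // a cursor only lives inside a transaction
pg_query($db, 'DECLARE exp CURSOR FOR SELECT * FROM registrants');
$out = fopen('php://output', 'w');
while (true) {
    $res = pg_query($db, 'FETCH 500 FROM exp');
    if (pg_num_rows($res) == 0) break;   // drained
    while ($row = pg_fetch_row($res)) {
        fputcsv($out, $row);             // one CSV line out, nothing kept
    }
    pg_free_result($res);
}
pg_query($db, 'CLOSE exp; COMMIT');
fclose($out);
?>

Memory stays flat at one FETCH batch no matter how many registrants there are.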
[]'s
ACV