| From: | Mike Ginsburg <mginsburg(at)collaborativefusion(dot)com> |
|---|---|
| To: | pgsql-general(at)postgresql(dot)org |
| Subject: | Large Result and Memory Limit |
| Date: | 2007-10-04 19:26:41 |
| Message-ID: | 47053E71.2060705@collaborativefusion.com |
| Lists: | pgsql-general |
Hello,
I am working on a personnel registry that has upwards of 50,000
registrants. Currently I am working on an export module that will
create a CSV from multiple tables. I have managed to keep the script
(PHP) under the memory limit when creating and inserting the CSV into
the database. The problem comes when I try to query that data back out and
export it: the entire CSV is stored in a single row, so the query for that one
row returns a result too large for PHP's memory limit and the script fails.
I've thought about splitting the data across multiple rows and then querying
them one at a time and outputting each (a rough sketch of that idea is below),
but I was hoping there was a better way.
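Here is a minimal sketch of the one-row-at-a-time idea, assuming the pg_* extension; the table and column names (registrants, id, last_name, ...) are made up. Instead of re-reading a stored single-row CSV, it streams rows from the source table through a server-side cursor so PHP never holds more than one small batch in memory:

```php
<?php
// Sketch: stream a CSV export in small batches via a server-side cursor.
// Table/column names are placeholders, not the real registry schema.
$db = pg_connect('dbname=registry');

header('Content-Type: text/csv');
header('Content-Disposition: attachment; filename="registrants.csv"');

$out = fopen('php://output', 'w');   // write straight to the client, not to a string

pg_query($db, 'BEGIN');              // cursors must live inside a transaction
pg_query($db, "DECLARE reg_cur CURSOR FOR
               SELECT id, last_name, first_name, email FROM registrants");

do {
    $res = pg_query($db, 'FETCH 1000 FROM reg_cur');   // pull a small batch
    $n   = pg_num_rows($res);
    for ($i = 0; $i < $n; $i++) {
        fputcsv($out, pg_fetch_row($res, $i));         // emit one CSV line, then discard it
    }
    pg_free_result($res);
} while ($n > 0);                    // an empty FETCH means the cursor is exhausted

pg_query($db, 'CLOSE reg_cur');
pg_query($db, 'COMMIT');
fclose($out);
?>
```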
Thanks in advance for the help.
MG
Mike Ginsburg
Collaborative Fusion, Inc.
mginsburg(at)collaborativefusion(dot)com
412-422-3463 x4015