From: "Scott Marlowe" <scott(dot)marlowe(at)gmail(dot)com>
To: "Mike Ginsburg" <mginsburg(at)collaborativefusion(dot)com>
Cc: pgsql-general(at)postgresql(dot)org
Subject: Re: Large Result and Memory Limit
Date: 2007-10-04 20:47:27
Message-ID: dcc563d10710041347y4a38c943g58dea986d61917e5@mail.gmail.com
Lists: pgsql-general
On 10/4/07, Mike Ginsburg <mginsburg(at)collaborativefusion(dot)com> wrote:
> This is for the export only. Since it is an export of ~50,000 registrants,
> it takes some time to process. We also have load balanced web servers, so
> unless I want to create identical processes on all webservers, or write some
> crazy script to scp it across the board, storing it as a text file is not an
> option. I realize that my way of doing it is flawed, which is the reason I
> came here for advice. The CSV contains data from approximately 15 tables,
> several of which are many-to-ones making joins a little tricky. My thought
> was to do all of the processing in the background, store the results in the
> DB, and allow the requester to download it at their convenience.
>
> Would it be a good idea to create a temporary table that stored all of the
> export data in it broken out by rows and columns, and when download time
> comes, query from there?
Yeah, I tend to think that would be better. Then you could use a
cursor to retrieve the rows and serve them one line at a time, and not
have to worry about overloading your PHP server.
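A minimal sketch of what I mean, assuming a staging table that holds the
pre-rendered CSV lines (the table and column names here are made up for
illustration, and the export id is hypothetical):

```sql
-- Hypothetical staging table the background job fills with export rows.
CREATE TABLE export_results (
    export_id integer,   -- which export request this row belongs to
    row_num   integer,   -- preserves CSV row order
    csv_line  text       -- one pre-rendered CSV line
);

-- At download time, open a cursor inside a transaction and fetch in
-- batches, so the full ~50,000-row result never sits in PHP memory:
BEGIN;
DECLARE export_cur CURSOR FOR
    SELECT csv_line
    FROM export_results
    WHERE export_id = 42          -- hypothetical request id
    ORDER BY row_num;
FETCH 1000 FROM export_cur;       -- repeat until no rows come back
CLOSE export_cur;
COMMIT;
```

From PHP you would loop on the FETCH, writing each batch straight to the
HTTP response, so memory use stays bounded by the batch size rather than
the size of the whole export.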
Next message: Bill Moran | 2007-10-04 21:03:48 | Re: Large Result and Memory Limit
Previous message: Mike Ginsburg | 2007-10-04 20:44:35 | Re: Large Result and Memory Limit