Re: Large Result and Memory Limit

From: André Volpato <andre(dot)volpato(at)ecomtecnologia(dot)com(dot)br>
To: Alvaro Herrera <alvherre(at)commandprompt(dot)com>
Cc: Mike Ginsburg <mginsburg(at)collaborativefusion(dot)com>, pgsql-general(at)postgresql(dot)org
Subject: Re: Large Result and Memory Limit
Date: 2007-10-04 20:16:03
Message-ID: 47054A03.3000106@ecomtecnologia.com.br
Lists: pgsql-general

<!DOCTYPE html PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN">
<html>
<head>
<meta content="text/html;charset=ISO-8859-1" http-equiv="Content-Type">
</head>
<body bgcolor="#ffffff" text="#000000">
Alvaro Herrera wrote:
<blockquote cite="mid:20071004200537(dot)GB28896(at)alvh(dot)no-ip(dot)org" type="cite">
<pre wrap="">Mike Ginsburg wrote:
</pre>
<blockquote type="cite">
<pre wrap="">Hello,
I am working on a personnel registry that has upwards of 50,000
registrants. Currently I am working on an export module that will create a
CSV from multiple tables. I have managed to keep the script (PHP) under
the memory limit when creating and inserting the CSV into the database.
The problem comes when I try to query for the data and export it. Memory
limit is a major concern, but the query for one row returns a result set
too large and PHP fails.
</pre>
</blockquote>
<pre wrap=""><!---->
One row? Wow, I didn't know PHP was that broken.

Try declaring a cursor and fetching a few rows at a time.</pre>
</blockquote>
PHP is just respecting memory_limit when retrieving data. <br>
In this case, a single row is about 30 MB, well over the 16 MB limit.<br>
I think cursors wouldn't help anyway, since even one fetched row has to
fit in memory at once.<br>
<br>
[]'s,<br>
ACV<br>
<br>
</body>
</html>
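A note on the point above: when a single row is larger than the interpreter's memory limit, cursors (which page over *rows*) don't help, but slicing the large *column* server-side does. Each round trip asks PostgreSQL for one slice via `substring(col FROM offset FOR length)` and streams it out, so memory use stays bounded by the slice size. The sketch below is in Python rather than PHP, and `run_query` is a hypothetical stand-in for the database round trip, not a real driver call:

```python
# Sketch: fetch one huge column in fixed-size slices instead of whole.
# The SQL per round trip would look like (parameters are placeholders):
#   SELECT substring(csv_data FROM %s FOR %s) FROM exports WHERE id = %s
# run_query(row_id, offset, length) is a hypothetical helper standing in
# for that query; swap in your driver of choice.

CHUNK = 1024 * 1024  # 1 MB per round trip, well under a 16 MB memory_limit

def fetch_in_chunks(run_query, row_id, chunk=CHUNK):
    """Yield the large column piece by piece; memory stays O(chunk)."""
    offset = 1  # SQL substring offsets are 1-based
    while True:
        piece = run_query(row_id, offset, chunk)
        if not piece:           # empty slice means we ran off the end
            return
        yield piece
        offset += len(piece)

# Demo with an in-memory stand-in for a ~30 MB column:
big_value = "x" * (3 * CHUNK + 123)
fake_db = lambda _id, off, ln: big_value[off - 1 : off - 1 + ln]
reassembled = "".join(fetch_in_chunks(fake_db, 42))
```

In the real export you would write each yielded piece straight to the output stream (or to a file) instead of joining them, so the full 30 MB value never exists in the script's memory at once.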

