| From: | "Mark Woodward" <pgsql(at)mohawksoft(dot)com> |
|---|---|
| To: | "Chris Mair" <chrisnospam(at)1006(dot)org> |
| Cc: | pgsql-hackers(at)postgresql(dot)org |
| Subject: | Re: Query Failed, out of memory |
| Date: | 2006-10-05 16:55:51 |
| Message-ID: | 18128.24.91.171.78.1160067351.squirrel@mail.mohawksoft.com |
| Lists: | pgsql-hackers |
>
>> > FWIW, there's a feature in CVS HEAD to instruct psql to try to use a
>> > cursor to break up huge query results like this. For the moment I'd
>> > suggest using COPY instead.
>>
>>
>> That's sort of what I was afraid of. I am trying to get 100 million
>> records into a text file in a specific order.
>>
>> Sigh, I have to write a quick program to use a cursor. :-(
>
> Why don't you try the psql client from 8.2beta1 then? This way you don't
> have to write the program yourself and you're helping out with beta
> testing as well :-)
> See FETCH_COUNT in
> http://developer.postgresql.org/pgdocs/postgres/app-psql.html
>
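For reference, the FETCH_COUNT approach suggested above boils down to something like the following in an 8.2 psql session. This is only a sketch; the table, column, and file names are hypothetical.

```
-- tell psql to fetch the result via a cursor, 10000 rows at a time,
-- instead of buffering the whole result set in client memory
\set FETCH_COUNT 10000
-- unaligned, tuples-only output is the usual choice for a plain text dump
\a
\t
-- send query output to a file (hypothetical name)
\o dump.txt
SELECT * FROM big_table ORDER BY some_key;
\o
```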
Well, maybe next time; it only took about 10 minutes to write. It is a
simple program.
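A quick program along those lines just has to wrap the query in an explicit cursor and fetch it in chunks, writing each batch to the output file. Roughly, the SQL it issues looks like this (a sketch only, not the actual program; the table, column, and cursor names are hypothetical):

```
BEGIN;
-- the cursor keeps the ordered result set on the server side
DECLARE c NO SCROLL CURSOR FOR
    SELECT * FROM big_table ORDER BY some_key;
-- the program loops on this, appending each batch to the text file,
-- until FETCH returns zero rows
FETCH FORWARD 10000 FROM c;
CLOSE c;
COMMIT;
```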
| | From | Date | Subject |
|---|---|---|---|
| Next Message | Mark Woodward | 2006-10-05 17:03:19 | Re: Query Failed, out of memory |
| Previous Message | Luke Lonergan | 2006-10-05 16:52:33 | Re: Query Failed, out of memory |