From: Hannu Krosing <hannu(at)trust(dot)ee>
To: Matthew Hagerty <matthew(at)venux(dot)net>
Cc: pgsql-interfaces(at)postgreSQL(dot)org
Subject: Re: [INTERFACES] Dealing with large query results
Date: 1999-02-08 21:25:32
Message-ID: 36BF564C.D1A9F8FA@trust.ee
Lists: pgsql-interfaces
Matthew Hagerty wrote:
>
> Greetings,
>
> I keep seeing references to an 8K limit on query results. So how do I deal
> with query results greater than 8K? Am I under the wrong impression that,
> when you run a query from, say, libpq, each row is returned only
> when you execute a fetch? Or is the whole result stuffed into a buffer
> which has an 8K limit?
The current 8K limit applies to the query text (and, separately, to single
rows), not to query _results_.
> I have to deal with some queries that create a huge result, and/or doing
> reports requires a large result set. How can I deal with these larger results?
If you don't want to digest a huge result in one go, you can do a
DECLARE CURSOR - FETCH - FETCH - FETCH - CLOSE CURSOR sequence to get it in
pieces (this needs to be inside a BEGIN TRANSACTION - END TRANSACTION block).
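A minimal sketch of that cursor sequence (the table name `bigtable` and
cursor name `bigcur` are placeholders, not anything from the original mail):

```sql
BEGIN;                           -- cursors only exist inside a transaction
DECLARE bigcur CURSOR FOR
    SELECT * FROM bigtable;      -- 'bigtable' is a hypothetical large table
FETCH FORWARD 100 IN bigcur;     -- grab the next 100 rows
FETCH FORWARD 100 IN bigcur;     -- repeat until a FETCH returns zero rows
CLOSE bigcur;
END;                             -- ends the transaction, releasing the cursor
```

From libpq you would send each of these statements with an ordinary query
call and process the rows of each FETCH result as a small, bounded batch.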
> I'll probably get flamed for asking this one but, how can I step through a
> table? Hey, sometimes you just have to do it.
See above.
Actually, postgres is quite happy to return gigabytes of data, if you have
the resources (disk space, memory, time ;)
--------------
Hannu