From: K D <keithdutton(at)gmail(dot)com>
To: pgsql-general(at)postgresql(dot)org
Subject: plpython large result set
Date: 2009-03-02 16:10:20
Message-ID: 7c69a9640903020810u27d7bb4do88a1bea49570bf9@mail.gmail.com
Lists: pgsql-general
Hello,
I am hoping to use plpython to perform various transforms on very large query
result sets.
The documentation in the official 8.3 manual makes it appear as if the
results of plpy.execute are read in all at once (e.g., they appear to support
random access and are mutable), rather than being fetched through a hidden
cursor the way a PL/pgSQL FOR loop over a query works. Is this correct? If
so, does it mean that I need to avoid plpy.execute for very large queries? If
so, a cursor/generator interface would seem to be a substantial improvement
for the future.
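
For concreteness, the pattern I am worried about is something like the
following (the table, column, and function names are made up):

CREATE FUNCTION transform_all() RETURNS integer AS $$
    # If plpy.execute materializes the whole result, this list of rows
    # sits entirely in memory before the loop even starts.
    rv = plpy.execute("SELECT id, payload FROM big_table")
    total = 0
    for row in rv:
        total += len(row["payload"])
    return total
$$ LANGUAGE plpythonu;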
If I cannot use plpy.execute, is there some way to declare and use a
standard cursor from within plpython? I can find nothing on this on the
web, and my own experimentation has been fruitless. Any quick example would
be hugely appreciated.
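
What I was hoping for is roughly the sketch below. I do not know whether
DECLARE/FETCH can actually be driven through plpy.execute inside the
function's transaction, so please treat it as a sketch only (names invented):

CREATE FUNCTION transform_batched() RETURNS integer AS $$
    # Sketch only: assumes DECLARE, FETCH, and CLOSE can be issued
    # through plpy.execute within the function's transaction.
    plpy.execute("DECLARE big_cur CURSOR FOR SELECT id, payload FROM big_table")
    total = 0
    while True:
        rows = plpy.execute("FETCH 1000 FROM big_cur")
        if rows.nrows() == 0:
            break
        for row in rows:
            total += len(row["payload"])
    plpy.execute("CLOSE big_cur")
    return total
$$ LANGUAGE plpythonu;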
If none of the above works, my fallback will be to execute the query in
PL/pgSQL and then, within the fetch loop, call a plpython function for each
row. This works as far as my testing has gone, but it seems unfortunate.
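
The shape of that fallback, again with invented table/column/function names,
is roughly:

CREATE FUNCTION transform_one(text) RETURNS integer AS $$
    # Per-row transform; the single text argument arrives as args[0].
    return len(args[0])
$$ LANGUAGE plpythonu;

CREATE FUNCTION transform_driver() RETURNS integer AS $$
DECLARE
    r record;
    total integer := 0;
BEGIN
    -- The FOR loop fetches rows incrementally through PL/pgSQL's own cursor.
    FOR r IN SELECT id, payload FROM big_table LOOP
        total := total + transform_one(r.payload);
    END LOOP;
    RETURN total;
END;
$$ LANGUAGE plpgsql;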
Thanks,
Kevin