From: Peter Eisentraut <peter_e(at)gmx(dot)net>
To: mlw <pgsql(at)mohawksoft(dot)com>
Cc: swampler(at)noao(dot)edu, "Jason M. Felice" <jfelice(at)cronosys(dot)com>, Postgres-hackers <pgsql-hackers(at)postgresql(dot)org>
Subject: Re: PostgreSQL and SOAP, suggestions?
Date: 2003-04-02 21:42:55
Message-ID: Pine.LNX.4.44.0304021746180.3656-100000@peter.localdomain
Lists: pgsql-hackers
mlw writes:
> That function looks great, but what happens if you need to return 1
> million records?
The same thing that happens with any set-returning function: memory
exhaustion.
> I have an actual libpq program which performs a query against a server,
> and will stream out the XML, so the number of records has very little
> effect on efficiency. I think the table2xml function is great for 99% of
> all the queries, but for those huge result sets, I think it may be
> problematic.
>
> What do you think?
Clearly, my approach is not sufficient if you need to handle big result
sets. But perhaps a compromise based on cursors could be designed so that
large parts of the format can be managed centrally. Such as:
DECLARE foo CURSOR FOR SELECT ... ;
-- gives you the XML Schema for the result set
SELECT xmlschema_from_cursor(foo);
-- gives you one row (<row>...</row>)
SELECT xmldata_from_cursor(foo);
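
To make that concrete, here is a rough libpq sketch of how a client might
stream a large result with such an interface. Neither
xmlschema_from_cursor() nor xmldata_from_cursor() exists today; the loop
below additionally assumes that xmldata_from_cursor() returns an empty
string once the cursor is exhausted, and that "bigtable" stands in for
whatever the real query is.

#include <stdio.h>
#include <libpq-fe.h>

int
main(void)
{
    PGconn   *conn = PQconnectdb("dbname=test");
    PGresult *res;

    if (PQstatus(conn) != CONNECTION_OK)
    {
        fprintf(stderr, "connection failed: %s", PQerrorMessage(conn));
        return 1;
    }

    /* the cursor only lives inside a transaction */
    PQclear(PQexec(conn, "BEGIN"));
    PQclear(PQexec(conn, "DECLARE foo CURSOR FOR SELECT * FROM bigtable"));

    /* emit the XML Schema once */
    res = PQexec(conn, "SELECT xmlschema_from_cursor(foo)");
    printf("%s\n", PQgetvalue(res, 0, 0));
    PQclear(res);

    /* then stream the rows without ever holding the whole set in memory */
    for (;;)
    {
        res = PQexec(conn, "SELECT xmldata_from_cursor(foo)");
        if (PQresultStatus(res) != PGRES_TUPLES_OK ||
            PQgetisnull(res, 0, 0) ||
            PQgetvalue(res, 0, 0)[0] == '\0')
        {
            PQclear(res);
            break;
        }
        printf("%s\n", PQgetvalue(res, 0, 0));   /* one <row>...</row> */
        PQclear(res);
    }

    PQclear(PQexec(conn, "COMMIT"));
    PQfinish(conn);
    return 0;
}

A real interface would probably let xmldata_from_cursor() return a batch of
rows per call to cut down on round trips, but the point is the same: the
client decides how much XML it holds at once.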
--
Peter Eisentraut peter_e(at)gmx(dot)net