From: Merlin Moncure <mmoncure(at)gmail(dot)com>
To: Hannu Krosing <hannu(at)2ndquadrant(dot)com>
Cc: Andrew Dunstan <andrew(at)dunslane(dot)net>, Abhijit Menon-Sen <ams(at)toroid(dot)org>, pgsql-hackers(at)postgresql(dot)org, hannu(at)krosing(dot)net
Subject: Re: JSON for PG 9.2
Date: 2012-04-16 16:41:30
Message-ID: CAHyXU0yoH=6xj58CRN2_TGkaP-fBiYksjvDaPhnESCNz2JdQkA@mail.gmail.com
Lists: pgsql-hackers
On Mon, Apr 16, 2012 at 11:19 AM, Hannu Krosing <hannu(at)2ndquadrant(dot)com> wrote:
> If doing something in 9.3 then what I would like is some way to express
> multiple queries. Basically a variant of
>
> query_to_json(query text[])
>
> where queries would be evaluated in order and then all the results
> aggregated into one json object.
I personally don't like variants of to_json that push the query in as
text: they defeat parameterization and have other issues. Another
point in favor of client-side processing is the new row-level
processing in libpq, so I'd argue that if the result is big enough to
warrant worrying about buffering (and it'd have to be a mighty big
json doc), the best bet is to extract it as rows. I'm playing around
with node.js for the json serving, and the sending code looks like
this:
// Stream the result out as one JSON array; each row's "jsondata"
// column is assumed to hold an already-serialized json value.
var first = true;
query.on('row', function(row) {
  if (first) {
    first = false;
    response.write('[');
  }
  else response.write(',');
  response.write(row.jsondata);
});
query.on('end', function() {
  if (first) response.write('[');  // no rows: still emit a valid array
  response.write(']');
  response.end();
});
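
For context, the surrounding setup might look roughly like this (a
sketch, not tested; it assumes the node-postgres "pg" module, and the
connection string, table, and column names are illustrative):

var http = require('http');
var pg = require('pg');

http.createServer(function(request, response) {
  pg.connect('postgres://localhost/mydb', function(err, client) {
    if (err) {
      response.writeHead(500);
      return response.end();
    }
    response.writeHead(200, {'Content-Type': 'application/json'});
    // row_to_json() serializes each row server-side; the 'row'/'end'
    // handlers above then stream the pieces straight to the client.
    var query = client.query(
      'select row_to_json(t) as jsondata from mytable t');
    // ...attach the 'row' and 'end' handlers shown above to "query"...
  });
}).listen(8080);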
-- not too bad
merlin