From: Mitar <mmitar(at)gmail(dot)com>
To: pgsql-general(at)postgresql(dot)org
Subject: Benchmark of using JSON to transport query results in node.js
Date: 2019-01-10 21:51:53
Message-ID: CAKLmikO1y8LD+8Ugd+cwUW0JWj36tmPfAfn+M33d1G2MHhiOOA@mail.gmail.com
Lists: pgsql-general
Hi!
I made some benchmarks of using JSON to transport query results to
node.js, and it seems it really makes a difference over using the
native or standard PostgreSQL result serialization. So the idea is
that you simply wrap all results into JSON, like SELECT to_json(t)
FROM (... original query ...) AS t. I am guessing this is because
node.js/JavaScript has a really fast JSON parser, while for
everything else there is parsing overhead. See my blog post for more
details [1]. Any feedback is welcome.
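For illustration, a minimal sketch of the wrapping with the
node-postgres (pg) driver might look like this (the table and column
names here are made up):

    const { Client } = require('pg');

    async function main() {
      // Connection settings are taken from the PG* environment variables.
      const client = new Client();
      await client.connect();

      // Wrap the original query so the server serializes each row to
      // JSON; node.js then only has to run its fast JSON parser.
      const res = await client.query(
        'SELECT to_json(t) FROM (SELECT id, name FROM users) AS t'
      );

      // Each row comes back as a single "to_json" column; the pg driver
      // parses the JSON value into a JavaScript object.
      for (const row of res.rows) {
        console.log(row.to_json);
      }

      await client.end();
    }

    main().catch(console.error);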
This makes me wonder: if serialization/deserialization makes such a
big impact, were there efforts to improve how results are serialized
for over-the-wire transmission? For example, to use something like
Cap'n Proto [2] to serialize into a structure which can be used
directly, without any real deserialization?
[1] https://mitar.tnode.com/post/181893159351/in-nodejs-always-query-in-json-from-postgresql
[2] https://capnproto.org/
Mitar