From: Merlin Moncure <mmoncure(at)gmail(dot)com>
To: Mitar <mmitar(at)gmail(dot)com>
Cc: f(dot)venchiarutti(at)ocado(dot)com, Dave Cramer <pg(at)fastcrypt(dot)com>, "pgsql-generallists(dot)postgresql(dot)org" <pgsql-general(at)lists(dot)postgresql(dot)org>
Subject: Re: Automatically parsing in-line composite types
Date: 2019-10-30 15:38:10
Message-ID: CAHyXU0yo9HKG4r+aS9bhmngPoUmZtQpvcSu42OK4rukqhHs2dQ@mail.gmail.com
Lists: pgsql-general
On Tue, Oct 29, 2019 at 12:58 PM Mitar <mmitar(at)gmail(dot)com> wrote:
>
> Hi!
>
> On Tue, Oct 29, 2019 at 9:06 AM Fabio Ugo Venchiarutti
> <f(dot)venchiarutti(at)ocado(dot)com> wrote:
> > You can use subqueries and array_agg() to deepen your output tree all
> > the way to a stack overflow, a single <whatever>_to_json() call at the
> > top will recursively traverse and convert whatever you feed it.
>
> Yes, what you are describing is exactly the sad state of things: the
> only way to meaningfully retrieve inline composite types (those
> created when you aggregate rows like that, or when you subselect a
> set of fields from a table in a sub-query) is to convert the whole
> thing to JSON and transmit it that way, because that is the only way
> you can parse it on the client. If you leave it as the raw composite
> type encoding, you cannot reliably parse it on the client in all
> cases without knowing which types are stored inside the composite
> values you are getting.
>
> But JSON is not a lossless transport format: it does not support the
> full floating point spec (no Infinity, no NaN), and many field types
> are simply converted to their string representation, which can be
> problematic. For example, binary blobs.
>
> So no, JSON is a workaround, and it is sad that we have to use it.
> PostgreSQL seems to be almost there with its support for composite
> types and nested query results; it is just that you cannot really
> parse them out on the client. I mean, why does PostgreSQL even have
> its own binary format for results, when it could just transmit
> everything as JSON? :-) But that does not really work for many data
> types.
>
> I think RowDescription should be extended to provide full recursive
> metadata about all data types. That would be the best way to do it.
Check out libpqtypes: https://github.com/pgagarinov/libpqtypes
It does exactly what you want. It's a wrapper for libpq that provides
client-side parsing of the binary protocol, including array and
composite type parsing. The library was written before the JSON train
left the station; its only use today (assuming you don't like the cute
'PQexecf') would be scenarios where performance requirements are
extreme and the client application is written in C (so you can benefit
from direct binary copying vs. string serialization). Virtually any
non-C client application really ought to be using JSON rather than the
custom binary structures libpqtypes provides. I co-wrote the library,
and it works wonderfully, but I've since moved on to JSON.
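For comparison, the JSON route needs essentially no client-side type machinery. A sketch (Python; the literal below stands in for what something like `SELECT row_to_json(t) FROM (...) t` would send over the wire, against a hypothetical schema):

```python
import json

# Stand-in for the server's reply to a nested query such as
#   SELECT row_to_json(t)
#   FROM (SELECT u.id, array_agg(o) AS orders FROM ... GROUP BY u.id) t
# (hypothetical schema; the point is that the nesting survives intact)
wire_value = '{"id": 7, "orders": [{"sku": "a1", "qty": 2}]}'

# One generic parser handles any depth of composites and arrays:
row = json.loads(wire_value)
print(row["orders"][0]["sku"])  # nested fields parse for free -> a1
```

Contrast that with the binary path, where the client must know the layout of every composite type in advance.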
JSON also undercuts the argument that the database ought to provide
deep type facilities on both sides of the protocol, IMO; I used to be
interested in that, but now I'm not; the world is moving on.
merlin