From: Adrian Klaver <adrian(dot)klaver(at)aklaver(dot)com>
To: Michael Nolan <htfoot(at)gmail(dot)com>, "David G(dot) Johnston" <david(dot)g(dot)johnston(at)gmail(dot)com>
Cc: "pgsql-general(at)postgresql(dot)org" <pgsql-general(at)postgresql(dot)org>
Subject: Re: Bypassing NULL elements in row_to_json function
Date: 2016-04-10 15:23:40
Message-ID: 570A6FFC.7090600@aklaver.com
Lists: pgsql-general
On 04/10/2016 07:49 AM, Michael Nolan wrote:
>
>
> On Sun, Apr 10, 2016 at 2:30 AM, David G. Johnston
> <david(dot)g(dot)johnston(at)gmail(dot)com> wrote:
>
> On Sat, Apr 9, 2016 at 9:48 PM, Michael Nolan
> <htfoot(at)gmail(dot)com> wrote:
>
>
> 2nd Followup: It turns out that loading a table from a JSON
> string is more complicated than going from a table to JSON,
> perhaps for good reason. There does not appear to be a direct
> inverse to the row_to_json() function, but it wasn't difficult
> for me to write a PHP program that takes the JSON file I created
> the other day and converts it back to a series of inserts,
> recreating the original table.
>
> Of course this simple program does NO validation (not that this
> file needed any), so if the JSON string is not well-formed for
> any of a number of reasons, or if it is not properly mapped to
> the table into which the inserts are made, an insert could fail
> or result in incorrect data.
> --
> Mike Nolan
>
>
> See: http://www.postgresql.org/docs/9.5/interactive/functions-json.html
>
> json_populate_record(base anyelement, from_json json)
> json_populate_recordset(base anyelement, from_json json)
>
> Exists in 9.3 too...though if you are going heavy json I'd suggest
> doing whatever you can to keep up with the recent releases.
>
> David J.
>
>
> If there's a way to use the json_populate_record() or
> json_populate_recordset() functions to load a table from a JSON file
> (eg, using copy), it would be nice if it was better documented. I did
> find a tool that loads a JSON file into a table (pgfutter), and even
> loaded one row from that table into another table using
> json_populate_record(), but the 'subquery returned multiple rows' issue
> wouldn't let me do the entire table.
Does the receiving table have the same structure as the sending table?
Is the receiving table already populated with data?
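If the goal is to do the load with COPY, one approach (a sketch only,
untested here, with hypothetical table and file names) is to stage the
raw file in a one-column table and then let json_populate_recordset()
expand it. It is a set-returning function, so it has to go in the FROM
clause; calling it from a scalar subquery is what produces the
'subquery returned multiple rows' error:

    -- Staging table holding the raw document. This assumes the whole
    -- JSON array sits on a single line and contains no characters that
    -- COPY's text format treats specially (tabs, backslashes).
    CREATE TABLE json_stage (doc json);
    \copy json_stage FROM 'table_dump.json'

    -- Expand the array into rows shaped like the target table.
    INSERT INTO target_table
    SELECT *
    FROM json_populate_recordset(null::target_table,
                                 (SELECT doc FROM json_stage));

The scalar subquery against json_stage is safe here because the staging
table holds exactly one row.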
>
> But that still doesn't deal with validating individual fields or
> checking that the JSON is complete and consistent with the table to be
> loaded.
Well, you know the JSON is not complete, as you dropped all the fields
in each row that had NULL values.
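That said, the populate functions are forgiving about this particular
gap: any key missing from the JSON comes back as the base record's
value, which is NULL when the base is null::tablename. A quick
illustration, using a hypothetical two-column table t(id int, name text):

    SELECT * FROM json_populate_record(null::t, '{"id": 1}');
    -- name is NULL because the key is absent from the JSON

So fields that were dropped because they were NULL round-trip back to
NULL anyway.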
Validation is a more complex subject, and honestly not something I
think could be accomplished in straight SQL. In other words, the data
would need to be run through some ETL tool. I use Python, so as an example:
https://petl.readthedocs.org/en/latest/
In particular:
https://petl.readthedocs.org/en/latest/io.html#json-files
https://petl.readthedocs.org/en/latest/transform.html#validation
> --
> Mike Nolan
--
Adrian Klaver
adrian(dot)klaver(at)aklaver(dot)com