From: Andrew Dunstan <andrew(at)dunslane(dot)net>
To: Joe Conway <mail(at)joeconway(dot)com>, Davin Shearer <davin(at)apache(dot)org>, PostgreSQL-development <pgsql-hackers(at)postgresql(dot)org>
Subject: Re: Emitting JSON to file using COPY TO
Date: 2023-12-05 21:20:11
Message-ID: 315b81d4-4b67-7828-0355-3808cd14acd1@dunslane.net
Lists: pgsql-general pgsql-hackers
On 2023-12-05 Tu 16:09, Joe Conway wrote:
> On 12/5/23 16:02, Joe Conway wrote:
>> On 12/5/23 15:55, Andrew Dunstan wrote:
>>> and in any other case (e.g. LINES) I can't see why you
>>> would have them.
>
> Oh I didn't address this -- I saw examples in the interwebs of MSSQL
> server I think [1] which had the non-array with commas import and
> export style. It was not that tough to support and the code as written
> already does it, so why not?
>
> [1]
> https://learn.microsoft.com/en-us/sql/relational-databases/json/remove-square-brackets-from-json-without-array-wrapper-option?view=sql-server-ver16#example-multiple-row-result
>
>
That seems quite absurd, TBH. I know we've catered for some absurdity in
the CSV code (much of it down to me), so maybe we need to be liberal in
what we accept here too. IMNSHO, we should produce either a single JSON
document (the ARRAY case) or a series of JSON documents, one per row
(the LINES case).
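
For concreteness, here's a rough sketch of the two shapes I mean. This is only an illustration: the option names (FORMAT json, FORCE_ARRAY) and the table/columns are placeholders, since the exact syntax is still what we're discussing in this thread:

    -- hypothetical table for illustration
    -- CREATE TABLE t (x int, y text);

    -- ARRAY case: one JSON document for the whole result set
    COPY (SELECT x, y FROM t) TO STDOUT (FORMAT json, FORCE_ARRAY true);
    -- would emit something like:
    -- [
    --  {"x":1,"y":"a"}
    -- ,{"x":2,"y":"b"}
    -- ]

    -- LINES case: one JSON document per row
    COPY (SELECT x, y FROM t) TO STDOUT (FORMAT json);
    -- would emit something like:
    -- {"x":1,"y":"a"}
    -- {"x":2,"y":"b"}

Either of those can be consumed unambiguously; the bracket-less, comma-separated variant in the MSSQL example is neither a valid JSON document nor a series of them.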
cheers
andrew
--
Andrew Dunstan
EDB: https://www.enterprisedb.com