From: Tom Lane <tgl(at)sss(dot)pgh(dot)pa(dot)us>
To: Robert Haas <robertmhaas(at)gmail(dot)com>
Cc: Joey Adams <joeyadams3(dot)14159(at)gmail(dot)com>, Bernd Helmle <mailings(at)oopsware(dot)de>, Dimitri Fontaine <dimitri(at)2ndquadrant(dot)fr>, David Fetter <david(at)fetter(dot)org>, Josh Berkus <josh(at)agliodbs(dot)com>, pgsql-hackers(at)postgresql(dot)org
Subject: Re: Initial Review: JSON contrib modul was: Re: Another swing at JSON
Date: 2011-07-18 19:19:11
Message-ID: 4091.1311016751@sss.pgh.pa.us
Lists: pgsql-hackers
Robert Haas <robertmhaas(at)gmail(dot)com> writes:
> On Fri, Jul 15, 2011 at 3:56 PM, Joey Adams <joeyadams3(dot)14159(at)gmail(dot)com> wrote:
>> I'm having a really hard time figuring out how the JSON module should
>> handle non-Unicode character sets.
> But, again, why not just forget about transcoding and define it as
> "JSON, if you happen to be using utf-8 as the server encoding, and
> otherwise some variant of JSON that uses the server encoding as its
> native format?". It seems to me that that would be a heck of a lot
> simpler and more reliable, and I'm not sure it's any less useful in
> practice.
Right offhand, the only argument I can see against this is that we might
someday want to pass JSON datums directly to third-party loadable
libraries that are picky about UTF8-ness. But we could just document
and/or enforce that such libraries can only be used in UTF8-encoded
databases.
BTW, could the \uNNNN problem be finessed by leaving such escapes in
source form?
regards, tom lane