From: Timothy Garnett <tgarnett(at)panjiva(dot)com>
To: pgsql-general(at)postgresql(dot)org
Subject: Removing null bytes from a json column
Date: 2017-06-08 16:15:07
Message-ID: CAPcyiQ3fSB9T2q+OvyF_rmi8WAD9p1FdDMTw5izaQLE1qjUTYw@mail.gmail.com
Lists: pgsql-general
Does anyone have some tips on how to deal with an existing json-type column
that has some null bytes (\u0000) in it? It seems like anything I do that
touches a row with a null byte just errors. I'd love to simply remove them,
but I'm having trouble even figuring out how to find the affected rows.
The error I get is always:
PG::UntranslatableCharacter: ERROR: unsupported Unicode escape sequence
DETAIL: \u0000 cannot be converted to text.
CONTEXT: JSON data, line 1:
...st_name":"efxkerbs","company":"efxkerbs","email":...
Converting the column to jsonb -> errors.
Things I've tried to find one of the offending rows:
select id from xxx where command->'args'->2->>'company' = 'efxkerbs';
- same error on \u0000

select id from issued_crm_commands where command->'args'->2->'company' =
'"efxkerbs"'::json;
- no equality operator for json

select id from issued_crm_commands where
command->'args'->2->'company'::bytea = 'efxkerbs'::bytea;
- no conversion from json to bytea
Any ideas on how to find rows with a \u0000 in the json?
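[For reference, an untested sketch of one possible approach, reusing the xxx table and command column from above: the \u0000 escape only errors when it is converted to text (via ->>, or a cast to jsonb); casting the whole json value to text just yields the raw stored text, so searching that for the literal six-character escape sequence should not trip the error. This assumes standard_conforming_strings = on, the default since PostgreSQL 9.1.]

```sql
-- Find rows whose json text contains the literal escape sequence \u0000.
-- The backslash is doubled because LIKE treats \ as its escape character.
select id from xxx where command::text like '%\\u0000%';

-- Strip the escapes and write the value back. replace() matches literally,
-- so its pattern is a single backslash followed by u0000.
update xxx
set command = replace(command::text, '\u0000', '')::json
where command::text like '%\\u0000%';
```

[Note that deleting the escape outright changes the stored strings; replacing it with a placeholder such as a space may be safer depending on the data.]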
Tim