From: Laurenz Albe <laurenz(dot)albe(at)cybertec(dot)at>
To: Jan Bilek <jan(dot)bilek(at)eftlab(dot)com(dot)au>, "pgsql-general(at)postgresql(dot)org" <pgsql-general(at)postgresql(dot)org>
Subject: Re: ERROR: unsupported Unicode escape sequence - in JSON-type column
Date: 2023-02-27 12:13:06
Message-ID: 959c233d610ac3e9ee2c1f7e47ea7775e0d1f1f7.camel@cybertec.at
Lists: pgsql-general
On Mon, 2023-02-27 at 06:28 +0000, Jan Bilek wrote:
> Our customer was able to sneak Unicode data into a column of JSON type, and now that record fails on select.
> Would you be able to suggest any way out of this? E.g. finding the infringing row, updating its data ...?
I'd be curious to know how the customer managed to do that.
Perhaps there is a loophole in PostgreSQL that needs to be fixed.
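For illustration, here is one way that can happen with a plain "json" column (the names "jsontab", "jsoncol" and the key are made-up placeholders): "json" only validates the syntax on input, so a \u0000 escape is stored without complaint, whereas "jsonb" would reject it right away. The error only surfaces later, when an operation has to de-escape the string:

CREATE TABLE jsontab (id bigint PRIMARY KEY, jsoncol json);
-- accepted: "json" stores the text verbatim
INSERT INTO jsontab VALUES (1, '{"creationDateTime": "\u0000"}');
-- fails: converting the value to text must de-escape \u0000
SELECT jsoncol ->> 'creationDateTime' FROM jsontab;
-- ERROR:  unsupported Unicode escape sequence
-- DETAIL:  \u0000 cannot be converted to text.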
First, find the table that contains the column.
Then you can try something like this, reusing the placeholder names:
DO
$$DECLARE
   pkey bigint;
BEGIN
   FOR pkey IN SELECT id FROM jsontab LOOP
      BEGIN  -- starts a block with its own exception handler
         -- force the failing operation; untranslatable_character is
         -- SQLSTATE 22P05, raised when \u0000 cannot be converted to text
         PERFORM jsoncol -> 'creationDateTime'
         FROM jsontab
         WHERE id = pkey;
      EXCEPTION
         WHEN untranslatable_character THEN
            RAISE NOTICE 'bad character in row with id = %', pkey;
      END;
   END LOOP;
END;$$;
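Once you have the offending id, one way to repair the row is to edit the stored text directly: casting a "json" column to text returns it verbatim, without de-escaping, so replace() can strip the bad escape sequence. A minimal sketch, assuming the culprit is a literal \u0000 escape that occurs nowhere else in the value, and again using the placeholder names:

UPDATE jsontab
SET jsoncol = replace(jsoncol::text, '\u0000', '')::json
WHERE id = 42;  -- the id reported by the DO block above

Whether to drop the character or substitute something else depends on what your application expects, of course.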
Yours,
Laurenz Albe
--
Cybertec | https://www.cybertec-postgresql.com