From: rob stone <floriparob(at)gmail(dot)com>
To: Yuri Budilov <yuri(dot)budilov(at)hotmail(dot)com>, John R Pierce <pierce(at)hogranch(dot)com>, "pgsql-general(at)lists(dot)postgresql(dot)org" <pgsql-general(at)lists(dot)postgresql(dot)org>
Subject: Re: JSON out of memory error on PostgreSQL 9.6.x
Date: 2017-12-04 00:01:17
Message-ID: 1512345677.5030.1.camel@gmail.com
Lists: pgsql-general
On Sun, 2017-12-03 at 23:18 +0000, Yuri Budilov wrote:
> Posted on Stack Overflow, sadly no replies, so trying here....
>
> CREATE TABLE X AS
> SELECT json_array_elements(json_rmq -> 'orders'::text) AS "order"
> FROM table_name
> WHERE blah;
> I get out of memory error.
>
> Is there anything I can do to unpack the above?
>
> The JSON column is about ~5 MB and it has about ~150,000 array
> row elements in 'orders' above.
>
> I tried work_mem values up to ~250MB and it did not help; the query
> takes about the same time to fail.
>
> I guess this parameter does not help JSON processing.
>
> Is there another parameter I can try? Something else?
>
> I don't have control of the size of the JSON payload, it arrives, we
> store it in a JSON column and then we need to crack it open.
>
> Many thanks!
>
Hello,
It would help if you advised:
(a) version of PostgreSql being used.
(b) is column json_rmq defined as json or jsonb?
(c) OS.
Cheers,
Rob
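
[Editor's note] For context, one commonly suggested restatement of the quoted query is to move the set-returning function out of the SELECT list and into the FROM clause via LATERAL, which lets the executor reclaim per-row memory on 9.6. This is a sketch only, reusing the table and column names from the quoted query (`table_name`, `json_rmq`, `blah` are placeholders from the original post); note also that `order` is a reserved word in PostgreSQL and must be double-quoted when used as an alias:

```sql
-- Sketch, assuming the placeholder names from the quoted query.
-- Expanding the array in a LATERAL item instead of the SELECT list
-- can avoid the memory build-up seen with set-returning functions
-- in the target list on 9.6.
-- "order" is a reserved keyword, so the column alias is quoted.
CREATE TABLE x AS
SELECT elem AS "order"
FROM table_name,
     LATERAL json_array_elements(json_rmq -> 'orders') AS elem
WHERE blah;
```

If the column were `jsonb` rather than `json`, the equivalent function would be `jsonb_array_elements()`.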