Re: max_stack_depth problem though query is substantially smaller

From: "Bannert Matthias" <bannert(at)kof(dot)ethz(dot)ch>
To: Tom Lane <tgl(at)sss(dot)pgh(dot)pa(dot)us>, Charles Clavadetscher <clavadetscher(at)swisspug(dot)org>
Cc: "pgsql-general(at)postgresql(dot)org" <pgsql-general(at)postgresql(dot)org>
Subject: Re: max_stack_depth problem though query is substantially smaller
Date: 2016-04-08 16:31:46
Message-ID: 8586FCA42D306C4DB0BD46EF9F1B58025AFA61B7@MBX110.d.ethz.ch

Thanks for your reply. I do think it is a Postgres issue rather than an R issue; here's why:

a) R simply puts an SQL string together; what Charles posted was an excerpt of that string.
Basically we have 1.7 MB of that string, and everything else is equal: only the hstore contains 40K key-value pairs.

b) The error message clearly mentions max_stack_depth, which is a Postgres parameter.

c) If I take just the first part of that SQL string, i.e. the CREATE TEMP TABLE and INSERT INTO part without all the
update and join gibberish, put it into a .sql file, and simply run it through a psql client like this:

\i myquery.sql

then I get exactly the same error message, without any R involved at any stage (a minimal sketch of the query shape follows below this list):

psql:query.sql:3: ERROR: stack depth limit exceeded
HINT: Increase the configuration parameter "max_stack_depth" (currently 7168kB), after ensuring the platform's stack depth limit is adequate.

d) I have run into quite a few R stack errors before, and they look different... (C_STACK_SIZE)
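
To make c) concrete, here is a minimal sketch of the two query shapes involved
(table and key names are made up; the real myquery.sql is of course much larger):

    -- one long hstore literal: a single constant, parsed without deep recursion
    CREATE TEMP TABLE ts_data (id int, vals hstore);
    INSERT INTO ts_data VALUES (1, 'k1=>1.0, k2=>1.1'::hstore);

    -- chaining thousands of operators instead builds a parse tree whose depth
    -- grows with the number of terms, which is the classic way to exceed the
    -- stack depth limit:
    -- INSERT INTO ts_data VALUES (1, 'k1=>1.0'::hstore || 'k2=>1.1'::hstore || ...);

If the generated SQL chains operators like that instead of emitting one long
literal, that alone could explain the error.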

conclusion:
We are running a simple insert; nothing special except that the hstore has 40K key-value pairs. Could it be that the parsed representation of that hstore gets rather large,
so that a query string of only 1.7 MB exceeds the stack?
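
In case it helps, the limit mentioned in the HINT can be inspected, and raised
per session by a superuser, as long as the OS stack limit (ulimit -s on the
server) leaves a safety margin of a megabyte or two:

    SHOW max_stack_depth;          -- currently 7168kB, per the HINT above
    SET max_stack_depth = '12MB';  -- superuser only; keep below ulimit -s

That said, if 1.7 MB of SQL already exhausts 7 MB of stack, raising the limit
only treats the symptom, which is why I suspect the shape of the query rather
than its size.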

________________________________________
From: Tom Lane [tgl(at)sss(dot)pgh(dot)pa(dot)us]
Sent: Friday, April 08, 2016 4:20 PM
To: Charles Clavadetscher
Cc: pgsql-general(at)postgresql(dot)org; Bannert Matthias
Subject: Re: [GENERAL] max_stack_depth problem though query is substantially smaller

"Charles Clavadetscher" <clavadetscher(at)swisspug(dot)org> writes:
> When R processes the daily time serie we get a stack size exceeded
> error, followed by the hint to increase the max_stack_depth.

Postgres doesn't generally allocate large values on the stack, and I doubt
that R does either. Almost certainly, what is causing this is not data
size per se but unreasonable call nesting depth in your R code. You may
have a function that's actually in infinite recursion, or maybe it's
recursing to a depth governed by the number of data elements. If so,
consider revising it into iteration with an explicitly-represented state
stack.

regards, tom lane
