From: Bo Lorentsen <bl(at)netgroup(dot)dk>
To: pgsql-sql <pgsql-sql(at)postgresql(dot)org>
Subject: Large data sets and FOR LOOP's
Date: 2002-03-13 09:37:21
Message-ID: 1016012241.1094.90.camel@netgroup
Lists: pgsql-sql
Hi ...
I have been working on some server-side scripts in PL/pgSQL and I'm quite
happy with them (a nice and fast way to work with large data sets) :-)
But if my code runs a FOR loop over a huge select result (about
1.7 million rows), my function begins to consume a lot of memory. The code
looks like this:
FOR trec IN SELECT f1, f2 FROM large_table LOOP
...
END LOOP;
When selecting the result with a small LIMIT (about 1000) the function
still consumes a fair amount of memory, but when the function finishes
the memory is freed nicely again. When using the full select I run out
of memory (this machine has 2 GB plus swap :-)).
Is there a way to release this memory, or would a cursor be the answer?
Does PL/pgSQL not use normal scope rules in FOR loops?
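If an explicit cursor is the answer, I guess the loop would look roughly
like this (just a sketch, not tested, using the same table and columns as
above):
DECLARE
    curs CURSOR FOR SELECT f1, f2 FROM large_table;
    trec RECORD;
BEGIN
    OPEN curs;
    LOOP
        -- fetch one row at a time instead of pulling the whole result in
        FETCH curs INTO trec;
        EXIT WHEN NOT FOUND;
        -- ... work on trec.f1 and trec.f2 here ...
    END LOOP;
    CLOSE curs;
END;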
Anyway, I worked around the problem by removing the topmost FOR loop and
instead calling the function once per row from a SELECT, like this:
SELECT myfunc( f1, f2 ) FROM large_table;
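So myfunc now does the per-row work itself, roughly like this (a rough
sketch; the real argument and return types are of course different):
CREATE FUNCTION myfunc( integer, integer ) RETURNS integer AS '
DECLARE
    v_f1 ALIAS FOR $1;
    v_f2 ALIAS FOR $2;
BEGIN
    -- the work that used to sit inside the FOR loop, done for one row per call
    RETURN 1;
END;
' LANGUAGE 'plpgsql';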
Is this a known issue, is it a bug, or is there something I have
misunderstood?
Who maintains this part of PostgreSQL, anyway?
/BL