From: | Adrian Klaver <adrian(dot)klaver(at)aklaver(dot)com> |
---|---|
To: | armand pirvu <armand(dot)pirvu(at)gmail(dot)com> |
Cc: | "pgsql-general(at)postgresql(dot)org" <pgsql-general(at)postgresql(dot)org> |
Subject: | Re: cursors and function question |
Date: | 2018-02-13 22:31:51 |
Message-ID: | b0e739d5-50ce-507c-995b-49f27981b6fa@aklaver.com |
Lists: | pgsql-general |
On 02/13/2018 01:25 PM, armand pirvu wrote:
>
>> On Feb 13, 2018, at 1:22 PM, Adrian Klaver <adrian(dot)klaver(at)aklaver(dot)com
>> <mailto:adrian(dot)klaver(at)aklaver(dot)com>> wrote:
>>
>
> Not a trigger , but the idea is we will do some batch processing from
> said table let’s name it testtbl
>
> 1 - we get the records using select for update with a limit 100 for example
> 2 - update each record using a cursor
> 3 - print the cursor content so that way I have an idea what was updated
>
> I was thinking that if I can put a unique constraint on the table, I can
> generate a global table in the function , update main table from global
> table and return select from global table
Not entirely sure I know what you are trying to accomplish, still:
1) Not sure you need to use a cursor; see here for a less complicated way:
https://www.postgresql.org/docs/10/static/plpgsql-control-structures.html#PLPGSQL-RECORDS-ITERATING
and
Using RETURN NEXT.
Keeping in mind:
https://www.postgresql.org/docs/10/static/plpgsql-cursors.html
"Rather than executing a whole query at once, it is possible to set up a
cursor that encapsulates the query, and then read the query result a few
rows at a time. One reason for doing this is to avoid memory overrun
when the result contains a large number of rows. (However, PL/pgSQL
users do not normally need to worry about that, since FOR loops
automatically use a cursor internally to avoid memory problems.) A more
interesting usage is to return a reference to a cursor that a function
has created, allowing the caller to read the rows. This provides an
efficient way to return large row sets from functions."
So if you are keeping the batches to 100 rows, a FOR loop would seem to suffice.
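For instance, a minimal sketch of the FOR loop / RETURN NEXT approach over a FOR UPDATE batch (the table testtbl and its id/status columns are assumptions for illustration, not from your schema):

```sql
-- Hypothetical sketch: testtbl(id, status) is assumed.
CREATE OR REPLACE FUNCTION process_batch()
RETURNS SETOF testtbl AS $$
DECLARE
    rec testtbl%ROWTYPE;
BEGIN
    FOR rec IN
        SELECT * FROM testtbl
        WHERE status = 'pending'
        LIMIT 100
        FOR UPDATE        -- lock just the batch being processed
    LOOP
        UPDATE testtbl
        SET status = 'processed'
        WHERE id = rec.id;
        RETURN NEXT rec;  -- emit the row as it looked when selected
    END LOOP;
    RETURN;
END;
$$ LANGUAGE plpgsql;

-- SELECT * FROM process_batch();
```

The FOR loop uses a cursor internally, so you get the memory behavior of a cursor without managing one by hand, and the caller sees exactly which rows were touched.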
2) By global table do you mean a temporary table? If so, not sure that is
going to work, as I am pretty sure it will disappear after the function
is run. I could see having a permanent table that you INSERT the updated
rows into with a timestamp. Then you could update the main table from
that and prune old records using the timestamps.
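Something along these lines, say (again a hedged sketch; the log-table name, the status column, and the 30-day retention window are all assumptions):

```sql
-- Hypothetical sketch: permanent audit table shadowing testtbl,
-- plus a timestamp for pruning.
CREATE TABLE testtbl_log (
    LIKE testtbl,
    logged_at timestamptz NOT NULL DEFAULT now()
);

-- Record the rows that were just updated.
INSERT INTO testtbl_log
SELECT *, now() FROM testtbl WHERE status = 'processed';

-- Periodically prune entries older than, e.g., 30 days.
DELETE FROM testtbl_log WHERE logged_at < now() - interval '30 days';
```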
>
> I can see the developer desire to use cursors to minimize some effort on
> his side
>
> Thanks
>
> Armand
>
>
--
Adrian Klaver
adrian(dot)klaver(at)aklaver(dot)com