From: | "Rhys A(dot)D(dot) Stewart" <rhys(dot)stewart(at)gmail(dot)com> |
---|---|
To: | jian he <jian(dot)universality(at)gmail(dot)com> |
Cc: | pgsql-general(at)lists(dot)postgresql(dot)org |
Subject: | Re: Dynamically accessing columns from a row type in a trigger |
Date: | 2023-08-15 00:20:34 |
Message-ID: | CACg0vT=ELyfN5NEdp001TkJQxKyoxAAyUPwMmY_d+xYBHYMRtg@mail.gmail.com |
Lists: | pgsql-general |
Hello again,
> > Actually, now that I'm thinking about it, I don't really want to store
> > the value into a variable because the pk_col might be of any given
> > type. So ideally, I'd love a way to just get the value from OLD and
> > use it directly in another query. Something along the lines of:
> >
> > `EXECUTE format('SELECT * FROM %1$I.sometable WHERE pk = $1', myschma)
> > USING OLD['pk_col']`.
> >
> > I reckon I may have to look at just generating a trigger function per
> > table, or maybe look into using TG_ARGS.
So the less obvious solution that works is to create a temporary
table. It's a little verbose, but I get to keep the types:
`CREATE TEMPORARY TABLE _ ON COMMIT DROP AS SELECT OLD.*;`
Using `_` as the table name makes things a little easier to type.
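For context, a sketch of how this might sit inside a generic trigger function. The function, table, schema, and column names here (`capture_old_row`, `sometable`, `myschma`, `pk_col`) are assumptions for illustration, not from the thread:

```sql
-- Hypothetical sketch: materialize OLD into a temp table so the
-- primary-key value keeps its original type, whatever that type is.
CREATE OR REPLACE FUNCTION capture_old_row() RETURNS trigger AS $$
BEGIN
    -- ON COMMIT DROP keeps the temp table from lingering past the
    -- transaction; note a second firing in the same transaction would
    -- need the table dropped or a different name.
    CREATE TEMPORARY TABLE _ ON COMMIT DROP AS SELECT OLD.*;

    -- The row is now queryable with its original column types, e.g.
    -- joining back to another table without knowing pk_col's type:
    -- EXECUTE format(
    --     'SELECT * FROM %1$I.sometable t
    --       WHERE t.pk = (SELECT pk_col FROM _)', myschma);

    RETURN OLD;
END;
$$ LANGUAGE plpgsql;
```

This trades a per-row DDL statement for not having to declare a typed variable per table.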
Rhys
Peace & Love | Live Long & Prosper