From: Gerardo Perosio <gperosio(at)dd(dot)com(dot)ar>
To: pgsql-sql(at)postgresql(dot)org
Subject: SELECT FOR UPDATE CLAUSE
Date: 2001-10-10 13:06:01
Message-ID: 1002719161.525.45.camel@io.feedback.net.ar
Lists: pgsql-sql
Hi all.
I have a problem with an application written in PHP.
I need to lock a record for update, and I use the following transaction:
BEGIN WORK;
-- row1 is an indexed field
SELECT row1, row2 FROM table1 WHERE row3 IS NULL
    ORDER BY row1 LIMIT 1 FOR UPDATE;
-- fetch row1 into $myvar
UPDATE table1 SET row3 = now() WHERE row1 = $myvar;
...
COMMIT WORK;
When more than one transaction runs at the same time (multiuser
environment), the other transactions get back an empty result. That makes
sense, because by the time they are unblocked that record no longer has
row3 IS NULL. What I want is for each transaction to fetch some other row
where row3 IS NULL.
My idea is either to fetch 64 rows with LIMIT 64 (the maximum number of
concurrent transactions in postgresql.conf) and iterate through them, or
to loop, repeating the fetch until $myvar is not null (see the sketch
below).
Any other ideas?
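Something like this is what I have in mind (untested sketch; it assumes
$conn is an open connection from pg_connect(), uses the
pg_query()/pg_num_rows()/pg_fetch_row() names of PHP's PostgreSQL
functions, and claim_next_row() / $max_tries are just names made up for
the example):

<?php
// Untested sketch of the "loop until we get a row" idea.
// Assumes $conn is an open connection from pg_connect().
function claim_next_row($conn, $max_tries = 5)
{
    for ($try = 0; $try < $max_tries; $try++) {
        pg_query($conn, "BEGIN WORK");

        // Try to lock one still-unclaimed row.
        $res = pg_query($conn,
            "SELECT row1, row2 FROM table1
              WHERE row3 IS NULL
              ORDER BY row1
              LIMIT 1 FOR UPDATE");

        if (pg_num_rows($res) == 0) {
            // Another transaction grabbed the row we were waiting on
            // (or no unclaimed rows are left): end this transaction
            // and try again with a fresh SELECT.
            pg_query($conn, "ROLLBACK WORK");
            continue;
        }

        list($row1, $row2) = pg_fetch_row($res, 0);

        // Mark the row as claimed while we still hold the lock on it.
        pg_query($conn, "UPDATE table1 SET row3 = now() " .
                        "WHERE row1 = '" . pg_escape_string($row1) . "'");
        pg_query($conn, "COMMIT WORK");

        return array($row1, $row2);     // the claimed row
    }
    return false;                       // gave up after $max_tries attempts
}
?>

If even the fresh retry comes back empty without blocking, the table
probably has no unclaimed rows left at all.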
Thanks in advance.
--
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
"Despues de tres dias sin programar,
la vida se torna sin sentido."
-- Geoffrey James, "The Tao of Programming"
Gerardo Perosio
Operaciones - Desarrollos Digitales
http://www.dd.com.ar
(+54-11)6667-5700
Talcahuano 446 7* - Bs.As. Argentina
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~