From: Brian Hirt <bhirt(at)mobygames(dot)com>
To: pgsql-general(at)postgresql(dot)org
Subject: locking question
Date: 2004-04-27 23:17:18
Message-ID: 0740FC47-98A1-11D8-BDF4-000393D9FD00@mobygames.com
Lists: pgsql-general
I have a question about locks.

I have a stats table that gets updated when some other table changes.
Sometimes that other table is updated a second time before the first
stats update has finished, which causes an error. I've tried using 'SET
TRANSACTION ISOLATION LEVEL SERIALIZABLE' but get 'could not serialize
access due to concurrent update'. If I try 'READ COMMITTED' I get
primary key failures. This seems like a pretty common thing, and I'd
like to be able to do this without having to write code that checks for
the 'could not serialize access due to concurrent update' error and
re-runs the query.
I don't have much experience with locking because I haven't really
needed it before. Any advice would be greatly appreciated. Below is
basically the transaction I'm running -- it fails when a second one
starts while the first is still running.
BEGIN WORK;
DELETE FROM blah_stats WHERE id = 1;
INSERT INTO blah_stats
    SELECT id, count(*) FROM blah WHERE id = 1 GROUP BY id;
COMMIT WORK;
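One common workaround (a sketch, not something stated in the original
message) is to serialize the competing maintenance transactions with an
explicit lock, so the second transaction waits for the first to commit
instead of failing. The table and column names below come from the
message; the choice of lock mode is an assumption.

```sql
-- Sketch: serialize concurrent stats maintenance with a table lock.
-- SHARE ROW EXCLUSIVE conflicts with itself, so a second transaction
-- running this block queues until the first commits, avoiding both
-- the serialization error and the duplicate-key failure.
BEGIN WORK;
LOCK TABLE blah_stats IN SHARE ROW EXCLUSIVE MODE;
DELETE FROM blah_stats WHERE id = 1;
INSERT INTO blah_stats
    SELECT id, count(*) FROM blah WHERE id = 1 GROUP BY id;
COMMIT WORK;
```

This is coarse-grained -- it blocks maintenance for every id, not just
id = 1. A finer-grained alternative would be to take a row lock on a
parent table (e.g. SELECT ... FROM blah_parent WHERE id = 1 FOR UPDATE,
assuming such a row exists) before touching blah_stats.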
Regards,
Brian Hirt