From: Chris Withers <chris(at)simplistix(dot)co(dot)uk>
To: pgsql-general(at)postgresql(dot)org
Subject: using a postgres table as a multi-writer multi-updater queue
Date: 2015-11-23 10:41:06
Message-ID: 5652ED42.7020005@simplistix.co.uk
Lists: pgsql-general
Hi All,
I wondered if any of you could recommend best practices for using a
postgres table as a queue. Roughly speaking, 100-200 workers will vomit
rows into the table at rates of a few hundred per second, leaving the
status as 'new', and then as many workers as needed to keep up with the
load will plough through the queue, changing the status to something
other than 'new'.
My naive implementation would be something along the lines of:
CREATE TABLE event (
    ts      timestamp,
    event   char(40),
    status  char(10),
    CONSTRAINT pkey PRIMARY KEY (ts, event)
);
...with writers doing INSERT or COPY to get data into the table and
readers doing something like:
SELECT * FROM event WHERE status='new' LIMIT 1000 FOR UPDATE;
...so, grabbing batches of 1,000, working on them and then setting their
status.
But, am I correct in thinking that SELECT FOR UPDATE will not prevent
multiple workers selecting the same rows?
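One variant I've been wondering about (just a sketch, untested at our rates; the 'processing' status value is purely illustrative) is claiming and marking rows in a single statement, so the lock and the status change can't be separated:

```sql
-- Claim a batch of up to 1000 rows and mark them in one statement.
-- FOR UPDATE in the subquery locks the selected rows; the outer
-- UPDATE flips their status and hands the rows back via RETURNING.
UPDATE event
SET status = 'processing'
WHERE (ts, event) IN (
    SELECT ts, event
    FROM event
    WHERE status = 'new'
    LIMIT 1000
    FOR UPDATE
)
RETURNING ts, event;
```

Though I suspect concurrent workers would still block on each other's locked rows rather than skip past them.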
Anyway, is this approach reasonable? If so, what tweaks/optimisations
should I be looking to make?
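One tweak I'm already assuming I'd need (index name illustrative) is a partial index, so that finding 'new' rows stays cheap once the table fills up with already-processed ones:

```sql
-- Partial index: covers only unprocessed rows, so the readers'
-- WHERE status = 'new' scan doesn't degrade as old rows pile up.
CREATE INDEX event_new_idx ON event (ts) WHERE status = 'new';
```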
If it's totally wrong, how should I be looking to approach the problem?
cheers,
Chris