I have experience with Listen/Notify, and as you mention, the only problem is that I need a server-side client that calls LISTEN and then calls a db function on the NOTIFY.
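For concreteness, a minimal sketch in SQL of the pieces such a listener would drive (the channel name "async_work" and the function name process_async_work() are placeholders, not anything from the original post):

    -- hypothetical worker function the listening client calls on each NOTIFY
    CREATE OR REPLACE FUNCTION process_async_work() RETURNS void AS $$
    BEGIN
        -- real processing goes here
        RAISE NOTICE 'processing queued async work';
    END;
    $$ LANGUAGE plpgsql;

    -- producers inside the database signal work with either of:
    NOTIFY async_work;
    -- SELECT pg_notify('async_work', 'optional payload');

    -- the external listening session runs LISTEN async_work; once, and then
    -- issues SELECT process_async_work(); every time a notification arrives.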

One thought I had for doing this completely in the database is to write notify-handling code in PL/Python, similar to http://postgresql.1045698.n5.nabble.com/LISTEN-NOTIFY-and-python-td1878518.html  Then I could call that function on db startup and have it call the appropriate function whenever NOTIFY fires. It would probably be easier to write the function in C, but I don't have experience compiling a C function that will run in the database.

This would be a completely PostgreSQL solution.

Sim


On 12/11/2012 09:29 PM, rektide wrote:
Hi all, I'm writing to seek help with making asynchronous & decoupled processes run on a
Postgres server.

Here's my current harebrained working scheme (sketched below):
1. Create a table "async_process" and attach an AFTER trigger to it.
2. Establish a dblink to localhost.
3. dblink_send_query("update async_process set counter = counter + 1;") from other sprocs.
4. Designated processing hanging off this "async_process" table now runs.
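Roughly, in SQL, the sketch looks like this (do_designated_processing() stands in for whatever the real work is; authentication details are omitted):

    -- 1. a table whose only job is to carry the trigger
    CREATE TABLE async_process (counter integer NOT NULL DEFAULT 0);
    INSERT INTO async_process VALUES (0);

    CREATE OR REPLACE FUNCTION async_process_trigger() RETURNS trigger AS $$
    BEGIN
        PERFORM do_designated_processing();  -- placeholder for the real work
        RETURN NULL;
    END;
    $$ LANGUAGE plpgsql;

    CREATE TRIGGER async_process_after
        AFTER UPDATE ON async_process
        FOR EACH STATEMENT EXECUTE PROCEDURE async_process_trigger();

    -- 2. loopback connection (requires the dblink extension)
    SELECT dblink_connect('loopback', 'dbname=' || current_database());

    -- 3. fire-and-forget from other sprocs; the update, and therefore the
    --    trigger, runs in the loopback backend without blocking the caller
    SELECT dblink_send_query('loopback',
        'update async_process set counter = counter + 1;');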

All I'm doing is using a table to carry a trigger that can be run asynchronously.

There are at least two things gross about this strategy:
1. An "async_process" table exists only because I need a trigger I can fire at will.
2. Having to dblink to oneself to run a query from inside the database asynchronously.

Postgres has a capability for doing async work: NOTIFY/LISTEN. I'd like to verify first:
LISTEN is only for clients, correct? There's no way I can define something resident on
Postgres itself that will LISTEN, that can be targeted by notifications?
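To be clear about what I do understand: the server side can raise notifications on its own, e.g. from a trigger, but as far as I can tell only a connected client session can LISTEN for them. A quick sketch, with a made-up work_queue table and channel name:

    CREATE OR REPLACE FUNCTION work_queue_notify() RETURNS trigger AS $$
    BEGIN
        PERFORM pg_notify('work_queued', NEW.id::text);
        RETURN NEW;
    END;
    $$ LANGUAGE plpgsql;

    CREATE TRIGGER work_queue_notify_after
        AFTER INSERT ON work_queue
        FOR EACH ROW EXECUTE PROCEDURE work_queue_notify();

    -- nothing resident in the server can consume 'work_queued';
    -- delivery only reaches sessions that have run: LISTEN work_queued;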

Does anyone have suggestions for decoupling work done on a server, for breaking up a task
into multiple asynchronous pieces? I believe I've described 1. a viable if ugly means of
doing so, and 2. limitations in the primary asynchronous toolsuite of Postgres, and am
looking for ways to make more progress.

Regards,
-rektide