From: Albert Cervera Areny <albertca(at)jazzfree(dot)com>
To: pgsql-general(at)postgresql(dot)org
Subject: Fwd: Stock update like application
Date: 2003-01-06 03:40:31
Message-ID: 200301060440.31630.albertca@jazzfree.com
Lists: pgsql-general
This message was sent to pgsql-performance first, but I didn't see much
activity there, so I'm trying it here :-) Sorry to those on both lists!
Hi!
I'm developing a small application which I'd like to be as fast as
possible. The program receives an order by modem and has to answer with the
products my company will be able to send. An order can contain as many as
300-400 products, and I don't want to force my clients to set a high
time-out before the answer is sent.
I also need to use transactions, since I start calculating before the whole
order has been received, and if an error occurs everything has to be rolled
back.
Under these circumstances, which approach do you think would be faster?
- Make a sequence for each product (we're talking about 20,000 available
products, so that seems like a lot of sequences, but it might give a really
fast answer).
- Use standard SQL queries: SELECT the product and, if there are enough
units, UPDATE to decrease the number of available ones (I suppose this is
not very fast, as two queries have to be processed for each product; see the
sketch after this list).
- Use a CURSOR or something similar, which I'm not used to but have seen in
the examples.
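For the second option, what I had in mind is roughly the following (same
made-up 'stock' table as above):

    -- check availability, locking the row so nobody else changes it
    SELECT units FROM stock WHERE product_id = 1234 FOR UPDATE;
    -- if there are enough units, decrease them
    UPDATE stock SET units = units - 10 WHERE product_id = 1234;

    -- or, folding the check into a single statement:
    UPDATE stock SET units = units - 10
     WHERE product_id = 1234 AND units >= 10;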
Should I have the queries saved in the database to increase performance?
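By that I mean something like a PL/pgSQL function, roughly as follows (just
a sketch; the function name and the 'stock' table are made up):

    CREATE FUNCTION reserve_product(integer, integer) RETURNS boolean AS '
    DECLARE
        p_id  ALIAS FOR $1;   -- product id
        p_qty ALIAS FOR $2;   -- units requested
    BEGIN
        -- decrease stock only if enough units are available
        UPDATE stock SET units = units - p_qty
         WHERE product_id = p_id AND units >= p_qty;
        RETURN FOUND;         -- true if the UPDATE touched a row
    END;
    ' LANGUAGE 'plpgsql';

    -- then called from the application as:
    SELECT reserve_product(1234, 10);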
I hope I explained well enough :-) Thanks in advance!