From: Albert Cervera Areny <albertca(at)jazzfree(dot)com>
To: pgsql-performance(at)postgresql(dot)org
Subject: Fwd: Stock update like application
Date: 2003-01-04 13:31:51
Message-ID: 200301041431.51423.albertca@jazzfree.com
Lists: pgsql-performance
Hi!
I'm developing a small application which I'd like to be as fast as
possible. The program simply receives an order by modem and has to answer
with the products my enterprise will be able to send. The number of
products could be as high as 300-400, and I don't want to force my clients
to set a high time-out before the answer is sent.
I also need to use transactions, as I start calculating before the whole
order has been received, and if an error occurs everything has to be rolled
back.
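A minimal sketch of that rollback-on-error flow in plain SQL (the `products` table and its columns here are hypothetical, just to illustrate the shape):

```sql
BEGIN;

-- reserve stock for each order line as it arrives
UPDATE products SET stock = stock - 10 WHERE product_id = 42;
UPDATE products SET stock = stock - 5  WHERE product_id = 99;

-- on any error: undo every reservation made so far
ROLLBACK;

-- or, if the whole order was processed successfully:
COMMIT;
```

In practice you would issue either ROLLBACK or COMMIT, never both; the point is that no stock change becomes visible until the whole order succeeds.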
Under these circumstances, which approach do you think would be fastest?
- Create a sequence for each product (we're talking about 20000 available
products, so that seems like a lot of sequences, but it might give a really
fast answer).
- Using standard SQL queries: SELECT the product and, if there are enough
units, UPDATE to decrease the number of available ones. (I suppose this is
not very fast, as two queries need to be processed for each product.)
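For what it's worth, the SELECT-then-UPDATE pair described above can often be collapsed into a single conditional UPDATE. A sketch, again assuming a hypothetical `products(product_id, stock)` table:

```sql
-- one statement: decrements only when enough units are available
UPDATE products
   SET stock = stock - 10
 WHERE product_id = 42
   AND stock >= 10;
-- if the command tag reports "UPDATE 0", there was not enough stock
```

Checking the number of rows affected then replaces the separate SELECT, halving the round trips per product.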
- Using a CURSOR or something similar, which I'm not used to but have seen
in the examples.
Should I have the queries stored in the database to increase performance?
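One way of "storing the queries in the database" is a server-side PL/pgSQL function, which also keeps the check-and-decrement on the server in a single round trip. A sketch with hypothetical names:

```sql
-- reserve_stock(product_id, quantity): true when the decrement succeeded
CREATE FUNCTION reserve_stock(integer, integer) RETURNS boolean AS '
BEGIN
    UPDATE products SET stock = stock - $2
     WHERE product_id = $1 AND stock >= $2;
    RETURN FOUND;  -- FOUND is true if the UPDATE touched a row
END;
' LANGUAGE plpgsql;

SELECT reserve_stock(42, 10);
```

The function's plan handling and exact performance benefit depend on the PostgreSQL version, so treat this as a direction to benchmark rather than a guaranteed win.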
I hope I explained well enough :-) Thanks in advance!