From: Saurav Sarkar <saurav(dot)sarkar1(at)gmail(dot)com>
To: PostgreSQL General <pgsql-general(at)lists(dot)postgresql(dot)org>
Subject: Connection queuing by connection pooling libraries
Date: 2021-10-19 17:15:23
Message-ID: CAP+kwAXHTJaYcj89nUfSjHe=xy+ede5gYASd0vAKi=2mjk7How@mail.gmail.com
Lists: pgsql-general
Hi All,
A basic question on handling a large number of concurrent requests against the database.
I have a cloud service that can receive a large number of requests, each of which
triggers database operations.
Every database has some maximum connection limit, which can be exhausted under a
large number of requests.
I know connection pooling can be used to reuse connections, but that does not help
when the number of active concurrent requests exceeds the pool size.
My queries are already optimised and short-lived.
For that I need some queuing mechanism, like PgBouncer for Postgres:
https://www.percona.com/blog/2021/02/26/connection-queuing-in-pgbouncer-is-it-a-magical-remedy/
PgBouncer, as I understand it, is a proxy that needs to be installed separately on
the web or database server.
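For reference, the queuing behaviour in PgBouncer is governed by a few settings in pgbouncer.ini. A minimal sketch (database name, paths, and sizes are illustrative, not from the original post):

```
[databases]
; clients connect to PgBouncer, which proxies to the real server
mydb = host=127.0.0.1 port=5432 dbname=mydb

[pgbouncer]
listen_addr = 127.0.0.1
listen_port = 6432
auth_type = md5
auth_file = /etc/pgbouncer/userlist.txt
pool_mode = transaction      ; server connection released at transaction end
max_client_conn = 1000       ; clients allowed to connect (excess ones wait)
default_pool_size = 20       ; actual server connections per database/user pair
```

With these settings, up to 1000 clients can connect, but only 20 server connections are opened per database/user pair; additional clients queue inside PgBouncer until a server connection frees up.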
I was wondering whether the usual client-side connection pooling libraries, such as
Apache DBCP, can provide similar connection queuing while running inside the
application runtime.
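Client-side pools do queue in this sense: when every connection is checked out, a borrower blocks until one is returned or a timeout elapses (in Apache Commons DBCP2, this is configured via `BasicDataSource.setMaxTotal` and `setMaxWaitMillis`). The idea can be sketched with only the JDK's `Semaphore`; this is a hypothetical illustration of the mechanism, not DBCP's actual implementation:

```java
import java.util.concurrent.Semaphore;
import java.util.concurrent.TimeUnit;

// Hypothetical sketch of client-side connection queuing: a fixed number of
// permits stands in for pooled connections, and callers beyond that limit
// wait in a FIFO queue up to a timeout.
public class PoolQueueSketch {
    private final Semaphore permits;
    private final long maxWaitMillis;

    public PoolQueueSketch(int maxTotal, long maxWaitMillis) {
        // fair = true: blocked borrowers are served in arrival order
        this.permits = new Semaphore(maxTotal, true);
        this.maxWaitMillis = maxWaitMillis;
    }

    // Returns true if a "connection" was obtained, false if the wait timed out.
    public boolean acquire() throws InterruptedException {
        return permits.tryAcquire(maxWaitMillis, TimeUnit.MILLISECONDS);
    }

    // Return the "connection" to the pool, waking one queued borrower.
    public void release() {
        permits.release();
    }

    public static void main(String[] args) throws InterruptedException {
        PoolQueueSketch pool = new PoolQueueSketch(2, 100);
        System.out.println(pool.acquire()); // true: first connection handed out
        System.out.println(pool.acquire()); // true: second (last) connection
        System.out.println(pool.acquire()); // false: pool exhausted, wait timed out
        pool.release();                     // one connection returned
        System.out.println(pool.acquire()); // true: freed connection reused
    }
}
```

The key limitation is the same as in the question: this only queues requests within one application instance, whereas a proxy like PgBouncer queues across all clients of the database.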
Or is there some other way to handle this problem?
Best Regards,
Saurav