From: Jón Ragnarsson <jonr(at)physicallink(dot)com>
Cc: pgsql-performance(at)postgresql(dot)org
Subject: Re: 100 simultaneous connections, critical limit?
Date: 2004-01-14 13:44:17
Message-ID: 400547B1.7000202@physicallink.com
Lists: pgsql-performance
Ok, connection pooling was the first thing I thought of, but I haven't
found any docs on pooling with PHP+Postgres.
OTOH, I designed the application to be as independent from the DB as
possible (no stored procedures or other Postgres-specific stuff).
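
The closest thing PHP itself ships with is persistent connections:
pg_pconnect() behaves like pg_connect() but leaves the link open when the
script ends, so the next request served by the same PHP/Apache process
reuses it. A minimal sketch, with placeholder connection parameters:

    <?php
    // Persistent variant of pg_connect(): the link is cached in the
    // server process instead of being torn down at script end.
    // host/dbname/user below are placeholders.
    $db = pg_pconnect("host=localhost dbname=mydb user=webuser");
    if (!$db) {
        die("could not connect to Postgres");
    }

    $res = pg_query($db, "SELECT version()");
    $row = pg_fetch_row($res);
    echo $row[0];

    // No pg_close() here: closing a persistent link is a no-op;
    // the process keeps it open for the next request.
    ?>

One caveat: each web server child ends up holding its own persistent
connection, so max_connections has to cover the server's worker count
(MaxClients on Apache), which loops back to the limit discussed below.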
Thanks,
J.
Christopher Browne wrote:
> Clinging to sanity, jonr(at)physicallink(dot)com (Jón Ragnarsson) mumbled into her beard:
>
>>I am writing a website that will probably have some traffic.
>>Right now I wrap every .php page in pg_connect() and pg_close().
>>Then I read somewhere that Postgres only supports 100 simultaneous
>>connections (default). Is that a limitation? Should I use some other
>method when writing code for a high-traffic website?
>
>
> I thought the out-of-the-box default was 32.
>
> If you honestly need a LOT of connections, you can configure the
> database to support more. I "upped the limit" on one system to 512
> the other week; certainly supportable, if you have the RAM for it.
>
> It is, however, quite likely that the connect()/close() cycle cuts down
> on the efficiency of your application. If PHP supports some form of
> "connection pooling," you should consider using that, as it will cut
> down _dramatically_ on the amount of work done establishing/closing
> connections, and should let your apps use somewhat fewer connections
> more effectively.
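
For reference, the knob Christopher is describing is max_connections in
postgresql.conf; it can only be changed at server start. A sketch with
illustrative values, not a tuning recommendation:

    # postgresql.conf -- illustrative values only
    max_connections = 512    # takes effect on restart; each allowed slot
                             # reserves shared memory and semaphores
    shared_buffers = 1024    # the 7.x docs want at least 2 * max_connections;
                             # raising it may also mean raising kernel SHMMAX

And, as he notes, every live backend also consumes real RAM for its sort
and work areas, so the limit is only as cheap as your memory.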