From: Christopher Browne <cbbrowne(at)acm(dot)org>
To: pgsql-general(at)postgresql(dot)org
Subject: Re: Thousands of parallel connections
Date: 2004-08-16 14:54:50
Message-ID: m3smanw2th.fsf@wolfe.cbbrowne.com
Lists: pgsql-general
Centuries ago, Nostradamus foresaw when peter_e(at)gmx(dot)net (Peter Eisentraut) would write:
> Is there any practical limit on the number of parallel connections that a
> PostgreSQL server can service? We're in the process of setting up a system
> that will require up to 10000 connections open in parallel. The query load
> is not the problem, but we're wondering about the number of connections.
> Does anyone have experience with these kinds of numbers?
We commonly have a thousand connections open on some servers, and while
it works, we consider it problematic: that many backends tends to lead
to heavy spinlock contention.
You might want to look into pgpool:
<http://www2b.biglobe.ne.jp/~caco/pgpool/index-e.html>
Jan Wieck has tried it out with his version of the TPC-W benchmark, and
found that it cut down the _true_ number of connections and helped
performance considerably in cases where the application imagined it
needed a lot of connections.
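For illustration only (none of this is from Jan's test setup): the same
pooling idea can be sketched on the client side. A minimal Python
example, assuming psycopg2 and a placeholder DSN, in which a hundred
worker threads share at most ten real connections:

    import threading
    import psycopg2.pool

    # Ten real backend connections serve a hundred application threads.
    # The DSN below is a placeholder, not anything from the original post.
    pool = psycopg2.pool.ThreadedConnectionPool(
        minconn=2, maxconn=10,
        dsn="dbname=test user=app host=localhost")

    # psycopg2's pool raises an error rather than blocking when it is
    # exhausted, so cap concurrent borrowers at the pool's maximum.
    sem = threading.BoundedSemaphore(10)

    def worker():
        with sem:
            conn = pool.getconn()        # borrow a real connection
            try:
                with conn.cursor() as cur:
                    cur.execute("SELECT 1")
                    cur.fetchone()
                conn.commit()
            finally:
                pool.putconn(conn)       # hand it back for the next thread

    threads = [threading.Thread(target=worker) for _ in range(100)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    pool.closeall()

pgpool does this sort of thing transparently, sitting between the
clients and the server, so the application needs no such changes; the
point is simply that only a bounded number of real backends ever exist.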
--
(reverse (concatenate 'string "gro.gultn" "@" "enworbbc"))
http://www.ntlug.org/~cbbrowne/spiritual.html
"The last good thing written in C was Franz Schubert's Symphony number
9." -- Erwin Dieterich