From: Kobus Wolvaardt <kobuswolf(at)gmail(dot)com>
To: pgsql-general(at)postgresql(dot)org
Subject: Postgres memory question
Date: 2009-08-09 15:53:44
Message-ID: 3bea3b5f0908090853q484fcfc8r181c2109fe949656@mail.gmail.com
Lists: pgsql-general
Hi,
We have software deployed on our network that needs Postgres, and we have a server
that hosts the database. Everything worked fine until we crossed about 200 users.
The application is written so that it makes a connection right at startup
and keeps it alive for the duration of the app. The app is written in
Delphi. The Postgres server runs on a Windows 2008 server with a quad-core CPU
and 4 GB of RAM.
The problem is that after roughly 200 connections the server runs out of memory,
even though most of these connections are idle... each one is only used about
every 20 minutes to capture a transaction.
It looks like every idle connection uses about 10 MB of RAM, which seems high,
but I cannot find a config option to limit it.
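In case it matters, these are the kinds of postgresql.conf settings I assume are
relevant to per-connection memory (example values only, not our live config):

    # postgresql.conf (example values, not our actual settings)
    max_connections = 300        # hard cap on concurrent backends
    shared_buffers = 512MB       # shared cache, allocated once, not per connection
    work_mem = 1MB               # per sort/hash operation, can be multiplied per connection
    maintenance_work_mem = 64MB  # used by VACUUM, CREATE INDEX, etc.

As far as I can tell, though, none of these caps the baseline memory an idle
backend holds on to.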
I tried pgbouncer for connection pooling, but for each connection to
pgbouncer one connection is made to the server, which results in exactly the
same number of connections. If I run it in transaction pooling mode it works
for simple queries, but according to the programmer some session state gets
lost (views that were set up, or something along those lines).
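For reference, the pgbouncer setup I tried looks roughly like this (host, database,
and file names are placeholders, not our exact config):

    ; pgbouncer.ini (example only)
    [databases]
    appdb = host=127.0.0.1 port=5432 dbname=appdb

    [pgbouncer]
    listen_addr = *
    listen_port = 6432
    auth_type = md5
    auth_file = userlist.txt
    pool_mode = session        ; switching this to 'transaction' is what loses the session state
    max_client_conn = 400      ; clients allowed to connect to pgbouncer
    default_pool_size = 20     ; actual server connections per database/user pair

My understanding is that in session mode each persistent client holds its server
connection for the whole session, which is why the connection count stays the same.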
Any help or pointers would be appreciated, either on how to reduce the
per-connection memory usage or on how to get pooling to work.
Thanks,
Kobus Wolvaardt
P.S. We are growing the user count by another 20% soon, and that will result in
massive issues. I don't mind slower operation for now; I just need to keep
it working.