Re: sensible configuration of max_connections

From: Justin <zzzzz(dot)graf(at)gmail(dot)com>
To: Chris Ellis <chris(at)intrbiz(dot)com>
Cc: "pgsql-generallists(dot)postgresql(dot)org" <pgsql-general(at)lists(dot)postgresql(dot)org>
Subject: Re: sensible configuration of max_connections
Date: 2020-02-07 13:29:30
Message-ID: CALL-XeMg_KPpphMUkHZmNnW4KWGp742DbX49OWLLRLRbsxCUxg@mail.gmail.com
Lists: pgsql-general

Hi Chris Withers

As stated, each connection uses a certain amount of resources, and it's very
easy to configure PostgreSQL so that even a small number of connections will
eat up all the RAM.

work_mem is the biggest consumer of resources. Say it's set to 5 MB per
connection: at 1,000 connections that's 5,000 MB that can be allocated.
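
To put rough numbers on it (purely illustrative figures, not a
recommendation for this cluster):

    # postgresql.conf -- example values only
    max_connections = 1000
    work_mem = 5MB          # per sort/hash operation in each backend
    # worst case: 1000 backends x 5MB = ~5 GB of work memory,
    # on top of shared_buffers and per-backend overhead

And that is before accounting for the fact that a single complex query can
allocate work_mem more than once.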

Whether a connection pooler is a valid option depends on how security is
laid out in the application. If the application understands how to keep
tenants out of each other's data with no leaks, then yes. If each tenant is
assigned a schema or a specific database, a connection pooler can still
work, but the configuration is going to be more difficult.
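
As a rough sketch of what that per-tenant setup could look like in PgBouncer
(tenant names, addresses and pool sizes here are made up, and the auth
settings will depend on how your tenants authenticate):

    ; pgbouncer.ini -- illustrative only
    [databases]
    ; one entry per tenant database, each with its own small pool
    tenant_a = host=127.0.0.1 port=5432 dbname=tenant_a pool_size=10
    tenant_b = host=127.0.0.1 port=5432 dbname=tenant_b pool_size=10

    [pgbouncer]
    listen_port = 6432
    auth_type = md5
    auth_file = /etc/pgbouncer/userlist.txt
    pool_mode = transaction
    max_client_conn = 1000     ; client connections PgBouncer will accept
    default_pool_size = 10     ; server connections per database/user pair

Transaction pooling keeps the number of real backends small, but
session-level features (SET, prepared statements, advisory locks) need care
with it.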

On Fri, Feb 7, 2020 at 7:50 AM Chris Ellis <chris(at)intrbiz(dot)com> wrote:

> Hi Chris
>
> On Fri, 7 Feb 2020, 08:36 Chris Withers, <chris(at)withers(dot)org> wrote:
>
>> Hi All,
>>
>> What's a sensible way to pick the number to use for max_connections?
>>
>
> Sensible in this context is somewhat variable. Each connection in
> PostgreSQL will be allocated a backend process. These are not the lightest
> weight of things.
>
> Each connection takes up space in shared memory, as mentioned in the
> manual.
>
>
>> I'm looking after a reasonable size multi-tenant cluster, where the
>> master handles all the load and there's a slave in case of hardware
>> failure in the master.
>> The machine is used to host what I suspect are mainly django
>> applications, so lots of short web requests; I'm not sure how much, if
>> any, connection pooling django's ORM does.
>>
>> I arbitrarily picked 1000 for max_connections and haven't had any
>> problems, but I'm onboarding another app on Monday that handles a few
>> million requests per day, and I'm thinking about potential problems
>> related to the number of available connections.
>>
>> What's "too much" for max_connections? What happens when you set it too
>> high? What factors affect that number?
>>
>
> When sizing max_connections you need to trade off how many connections
> your application will use at peak vs how much RAM and CPU you have.
>
> Each connection is capable of allocating work_mem and has a stack etc.
>
> As such you don't want max_connections to be able to run your system out
> of RAM.
>
> Given your situation I'd very seriously look at connection pooling using
> PgBouncer or similar. That way you can run with a far smaller
> max_connections and still cope with applications configured with large
> usually idle connection pools.
>
>
>> cheers,
>>
>> Chris
>>
>
> Regards,
> Chris Ellis
>
>>
