Could postgres12 support millions of sequences? (like 10 million)

From: pabloa98 <pabloa98(at)gmail(dot)com>
To: "pgsql-generallists(dot)postgresql(dot)org" <pgsql-general(at)lists(dot)postgresql(dot)org>
Subject: Could postgres12 support millions of sequences? (like 10 million)
Date: 2020-03-19 21:36:54
Message-ID: CAEjudX5UdEYJwSTYir07rUX10u3AdEyht_K=GWk_CyOChSFguQ@mail.gmail.com
Lists: pgsql-general

Hello,

My schema requires a counter for each combination of 2 values. Something
like:

CREATE TABLE counter(
    "group" INT NOT NULL,  -- GROUP is a reserved word, so it must be quoted
    element INT NOT NULL,
    seq_number INT NOT NULL DEFAULT 0,
    PRIMARY KEY ("group", element)
);

For each entry in counter, aka for each (group, element) pair, the model
requires a seq_number.

If I use a "counter" table, two concurrent transactions could still collide
on the same counter value. I need true sequence behavior. Is that possible
with a table like "counter", where the counter is incremented outside the
transaction so it behaves like a sequence, without race conditions between
concurrent transactions?
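For context, a minimal sketch of one common table-based approach (assuming
the "counter" table above, with placeholder values 1 and 42 for the pair):

```sql
-- Atomically increment and read the counter for one (group, element) pair.
-- Row-level locking serializes concurrent writers, so no value is handed
-- out twice -- but unlike a sequence, the row lock is held until the
-- transaction commits or rolls back, which can become a contention point.
UPDATE counter
   SET seq_number = seq_number + 1
 WHERE "group" = 1 AND element = 42
RETURNING seq_number;
```

This gives gapless, per-pair numbering, at the cost of serializing all
transactions that touch the same pair.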

The other option is to create a sequence for each new (group, element) pair
using triggers. There are millions of pairs, so this approach would generate
millions of sequences.
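A rough sketch of what that trigger might look like (the function and
sequence names here, like seq_<group>_<element>, are assumptions for
illustration only):

```sql
-- Hypothetical trigger function: creates a dedicated sequence the first
-- time a (group, element) pair is inserted into the counter table.
CREATE OR REPLACE FUNCTION make_pair_sequence() RETURNS trigger AS $$
BEGIN
    EXECUTE format('CREATE SEQUENCE IF NOT EXISTS seq_%s_%s',
                   NEW."group", NEW.element);
    RETURN NEW;
END;
$$ LANGUAGE plpgsql;

CREATE TRIGGER counter_make_seq
    AFTER INSERT ON counter
    FOR EACH ROW EXECUTE FUNCTION make_pair_sequence();
```

Callers would then fetch values with nextval() on the pair's sequence,
again built dynamically via format().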

How would a PostgreSQL database behave with millions of sequences in a
schema? Would performance degrade? Is there any other negative impact?

Regards

Pablo
