From: | Rob Sargent <robjsargent(at)gmail(dot)com> |
---|---|
To: | pabloa98 <pabloa98(at)gmail(dot)com> |
Cc: | "pgsql-generallists(dot)postgresql(dot)org" <pgsql-general(at)lists(dot)postgresql(dot)org> |
Subject: | Re: Could postgres12 support millions of sequences? (like 10 million) |
Date: | 2020-03-19 21:50:38 |
Message-ID: | B6FC3723-FF9D-4E5E-8506-3BF9421825DF@gmail.com |
Lists: | pgsql-general |
> On Mar 19, 2020, at 3:36 PM, pabloa98 <pabloa98(at)gmail(dot)com> wrote:
>
> Hello,
>
> My schema requires a counter for each combination of 2 values. Something like:
>
> CREATE TABLE counter(
>     "group"    INT NOT NULL,  -- "group" is a reserved word in PostgreSQL and must be quoted
>     element    INT NOT NULL,
>     seq_number INT NOT NULL DEFAULT 0,
>     PRIMARY KEY ("group", element)
> );
>
> For each entry in counter, aka for each (group, element) pair, the model requires a seq_number.
>
> If I use a table "counter", I could still have counter collisions between 2 transactions. I need true sequence behavior. Is that possible using a table like "counter", where the counter is incremented outside the transaction so it performs like a sequence, without race conditions between concurrent transactions?
>
> The other option is to create sequences for each new pair of (group, element) using triggers. There are millions of pairs. So this approach will generate millions of sequences.
>
> How would a PostgreSQL database behave with millions of sequences in a schema? Would its performance degrade? Is there any other negative impact?
>
> Regards
>
> Pablo
>
To clarify, are you hoping for consecutive numbers as each row is added to the table, i.e. "serial"?
What is the intention of "seq_number"?
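If a race-free per-pair counter is all that is needed, one common pattern is a single atomic upsert with `INSERT ... ON CONFLICT ... DO UPDATE ... RETURNING`. This is a sketch against the counter table quoted above (with the reserved word `group` quoted), not tested against your actual schema; the values 1 and 42 are placeholder group/element IDs:

```sql
-- Atomically allocate the next number for pair (1, 42).
-- The row lock taken by the UPDATE serializes concurrent
-- transactions touching the same ("group", element) pair.
INSERT INTO counter ("group", element, seq_number)
VALUES (1, 42, 1)
ON CONFLICT ("group", element)
DO UPDATE SET seq_number = counter.seq_number + 1
RETURNING seq_number;
```

Note the trade-off versus real sequences: this blocks concurrent transactions on the same pair until commit, whereas a sequence hands out values outside transaction boundaries (gap-free numbering is the upside, reduced concurrency per pair is the cost).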
| | From | Date | Subject |
|---|---|---|---|
Next Message | Samuel Smith | 2020-03-19 22:12:13 | Re: Invalid byte sequence errors on DB restore |
Previous Message | pabloa98 | 2020-03-19 21:36:54 | Could postgres12 support millions of sequences? (like 10 million) |