From: pabloa98 <pabloa98(at)gmail(dot)com>
To: Rob Sargent <robjsargent(at)gmail(dot)com>
Cc: "pgsql-generallists(dot)postgresql(dot)org" <pgsql-general(at)lists(dot)postgresql(dot)org>
Subject: Re: Could postgres12 support millions of sequences? (like 10 million)
Date: 2020-03-19 22:13:53
Message-ID: CAEjudX5T5E_two14i_i19uGMkSs_mT4JG4hhXWoY3Z5M+GaXRQ@mail.gmail.com
Lists: pgsql-general
On Thu, Mar 19, 2020 at 2:50 PM Rob Sargent <robjsargent(at)gmail(dot)com> wrote:
>
>
> > On Mar 19, 2020, at 3:36 PM, pabloa98 <pabloa98(at)gmail(dot)com> wrote:
> >
> > Hello,
> >
> > My schema requires a counter for each combination of 2 values. Something
> like:
> >
> > CREATE TABLE counter(
> >     "group" INT NOT NULL,  -- GROUP is a reserved word, so it must be quoted
> >     element INT NOT NULL,
> >     seq_number INT NOT NULL DEFAULT 0,
> >     PRIMARY KEY ("group", element)
> > );
> >
> > For each entry in counter, aka for each (group, element) pair, the model
> requires a seq_number.
> >
> > If I use a "counter" table, I could still have counter collisions
> between 2 transactions. I need true sequence behavior. Is it possible,
> using a table like "counter", to increment the counter outside the
> transaction so it performs like a sequence, without race conditions
> between concurrent transactions?
> >
> > The other option is to create sequences for each new pair of (group,
> element) using triggers. There are millions of pairs. So this approach will
> generate millions of sequences.
> >
> > How would a PostgreSQL database behave with millions of sequences in a
> schema? Would its performance degrade? Is there any negative impact?
> >
> > Regards
> >
> > Pablo
> >
>
> To clarify, are you hoping for consecutive numbers as each row is
> added to the table, i.e. “serial”?
>
> What is the intention of “seq_number”?
> >
> >
>
>
The idea is to have something like a serial sequence, but one per (group,
element) pair, so that when we insert rows into another table, we could have
something like:
group, element, event_id, ...
1, 1, 1
1, 1, 2
1, 1, 3
2, 1, 1
1, 1, 4
1, 3, 1
1, 1, 5
1, 3, 2
2, 1, 2
2, 1, 3
The 3rd column is the sequence number taken from the appropriate sequence,
which is created by a trigger.
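For context, the trigger-based approach could be sketched roughly like this
(this is only an illustration, not tested code; the function name
make_pair_sequence and the naming scheme seq_<group>_<element> are
assumptions, not anything from the thread):

```sql
-- Hypothetical sketch: create one sequence per (group, element) pair
-- the first time that pair is inserted into "counter".
CREATE OR REPLACE FUNCTION make_pair_sequence() RETURNS trigger AS $$
BEGIN
    -- Sequence name seq_<group>_<element> is an assumed convention.
    EXECUTE format('CREATE SEQUENCE IF NOT EXISTS seq_%s_%s',
                   NEW."group", NEW.element);
    RETURN NEW;
END;
$$ LANGUAGE plpgsql;

CREATE TRIGGER counter_make_seq
    BEFORE INSERT ON counter
    FOR EACH ROW EXECUTE FUNCTION make_pair_sequence();

-- Later, fetching the next number for pair (1, 1) would look like:
-- SELECT nextval('seq_1_1');
```

With millions of pairs this creates millions of sequences, which is exactly
the scaling question posed in the subject line.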
I want to implement a variation of https://stackoverflow.com/a/30204854 and
that will generate millions of sequences.
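For comparison, the table-based alternative raised earlier in the thread
could be sketched as a single atomic upsert on the counter row (again just a
sketch, not necessarily what the linked answer does; the literal values 1, 1
are placeholders):

```sql
-- Hypothetical sketch: bump the per-(group, element) counter atomically.
-- The row lock taken by the UPDATE serializes concurrent transactions,
-- so no two callers can receive the same seq_number.
INSERT INTO counter ("group", element, seq_number)
VALUES (1, 1, 1)
ON CONFLICT ("group", element)
DO UPDATE SET seq_number = counter.seq_number + 1
RETURNING seq_number;
```

Unlike nextval(), this increment rolls back with the enclosing transaction
(gapless numbering), at the cost of concurrent inserters for the same pair
blocking each other until commit or rollback.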