From: "Andrus" <kobruleht2(at)hot(dot)ee>
To: "pgsql-general" <pgsql-general(at)postgresql(dot)org>
Subject: Why there are 30000 rows in sample
Date: 2020-04-04 07:07:51
Message-ID: 3D72D8D12557453AB3E0EB46FB911C17@dell2
Lists: pgsql-general
Hi!
vacuumdb output:
vacuumdb: vacuuming database "mydb"
INFO: analyzing "public.mytable"
INFO: "mytable": scanned 2709 of 2709 pages, containing 10834 live rows and 0 dead rows; 10834 rows in sample, 10834 estimated total rows
For tables with more than 30000 rows, it shows that there are 30000 rows in the sample.
postgresql.conf does not set the default_statistics_target value. It contains only the commented-out default:
#default_statistics_target = 100 # range 1-10000
So I expected that there would be 100 rows in the sample.
Why does Postgres use 30000 rows (or the total number of rows, for smaller tables)?
Is 30000 some magic value, and how can it be controlled?
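In case it is relevant: I know the statistics target can be changed per column or per session before analyzing. A minimal sketch of what I have tried to experiment with (mycol is a hypothetical column name here):

```sql
-- Raise the statistics target for a single column:
ALTER TABLE mytable ALTER COLUMN mycol SET STATISTICS 500;

-- Or change the default for the current session, then re-analyze:
SET default_statistics_target = 500;
ANALYZE VERBOSE mytable;
```

But I do not understand where the 30000 figure comes from when the target is left at 100.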
Using Postgres 12 on Debian.
Andrus.