From: Jeff Janes <jeff(dot)janes(at)gmail(dot)com>
To: Joshua Berkus <josh(at)agliodbs(dot)com>
Cc: Stephen Frost <sfrost(at)snowman(dot)net>, pgsql-hackers(at)postgresql(dot)org
Subject: Re: Potential autovacuum optimization: new tables
Date: 2012-10-13 20:04:07
Message-ID: CAMkU=1wmyoUNVdaG__91j9iNFkfPB-yfi22R0ckOr0tRmACN5Q@mail.gmail.com
Lists: pgsql-hackers
On Sat, Oct 13, 2012 at 12:49 PM, Joshua Berkus <josh(at)agliodbs(dot)com> wrote:
>
> So, problem #1 is coming up with a mathematical formula. My initial target values are in terms of # of rows in the table vs. # of writes before analyze is triggered:
>
> 1 : 3
> 10 : 5
> 100 : 10
> 1000 : 100
> 100000 : 2000
> 1000000 : 5000
> 10000000 : 25000
> 100000000 : 100000
>
> .... etc. So problem #1 is a mathematical formula which gives this kind of curve. I've tried some solution-seeking software, but I don't know how to use it well enough to get something useful.
That is close to a power law, where the best fit is about
"threshold = 1.5 * (rows ** 0.6)":

    rows        yours       power fit
    1.00E+00    3.00E+00    1.50E+00
    1.00E+01    5.00E+00    5.97E+00
    1.00E+02    1.00E+01    2.38E+01
    1.00E+03    1.00E+02    9.46E+01
    1.00E+05    2.00E+03    1.50E+03
    1.00E+06    5.00E+03    5.97E+03
    1.00E+07    2.50E+04    2.38E+04
    1.00E+08    1.00E+05    9.46E+04
If you want something more natural, reduce the exponent from 0.6 to
0.5, so it becomes the square root.
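For anyone who wants to play with the curve, here is a minimal sketch of
the fitted formula (the function name `analyze_threshold` and the default
coefficients are my own illustration, not anything in PostgreSQL):

```python
def analyze_threshold(rows, coeff=1.5, exponent=0.6):
    """Writes before an analyze is triggered, per the power-law fit.

    Pass exponent=0.5 for the more natural square-root variant.
    """
    return coeff * rows ** exponent

# Reproduce the "power fit" column of the table above.
for rows in (1, 10, 100, 1000, 10**5, 10**6, 10**7, 10**8):
    print(f"{rows:>12,}  {analyze_threshold(rows):12.2E}")
```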
I have no opinion on the suitability of this, I'm just crunching the
numbers for you.
Cheers,
Jeff