From: Shelby Cain <alyandon(at)yahoo(dot)com>
To: Tom Lane <tgl(at)sss(dot)pgh(dot)pa(dot)us>
Cc: pgsql-general(at)postgresql(dot)org
Subject: Re: Memory usage during vacuum
Date: 2004-03-25 18:49:28
Message-ID: 20040325184928.46721.qmail@web41613.mail.yahoo.com
Lists: pgsql-general
I thought I had dropped and reloaded this table, but apparently I hadn't,
and I had set the statistics target for one column to 500 while
experimenting. Resetting it to -1 and running with a default of 300
brings the memory footprint during the analyze down to roughly 70 MB.

Thanks, Tom, for indulging my curiosity on the matter. I've learned
something that I didn't readily pick up from reading the documentation.
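
For anyone else following the thread, the reset described above looks
like this (the table and column names here are placeholders, not my
actual schema):

    ALTER TABLE table_name_here ALTER COLUMN column_name_here
        SET STATISTICS -1;  -- fall back to default_statistics_target
    ANALYZE table_name_here;

After the ANALYZE, attstattarget for that column in pg_attribute shows
-1 again, and the sample size is governed by default_statistics_target.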
Regards,
Shelby Cain
--- Tom Lane <tgl(at)sss(dot)pgh(dot)pa(dot)us> wrote:
> Shelby Cain <alyandon(at)yahoo(dot)com> writes:
> > It still decided to sample 150000 rows. Am I missing
> > something obvious here? Shouldn't fewer rows be
> > sampled when I set the collection target to 1?
>
> The sample size is 300 rows times the largest per-column analysis
> target, where default_statistics_target is used if the recorded
> per-column setting is -1. I would say that you have set a target of 500
> for at least one of the columns of that table, using ALTER TABLE SET
> STATISTICS. Try this to see which:
>
> select attname, attstattarget from pg_attribute
> where attrelid = 'table_name_here'::regclass;
>
> regards, tom lane