From: Mischa <mischa_Sandberg(at)telus(dot)net>
To: pgsql-sql(at)postgresql(dot)org
Subject: Re: same question little different test MSSQL vrs Postgres
Date: 2005-01-29 05:58:54
Message-ID: 1106978334.41fb261e28ac2@webmail.telus.net
Lists: pgsql-sql
Quoting Dennis Sacks <dennis(at)illusions(dot)com>:
> Bruno Wolff III wrote:
> >On Tue, Jan 25, 2005 at 21:21:08 -0700,
> > Dennis Sacks <dennis(at)illusions(dot)com> wrote:
> >>One of the things you'll want to do regularly is run a "vacuum analyze".
> >>You can read up on this in the postgresql docs. This is essential to the
> >>indexes being used properly. At a bare minimum, after you import a large
> >>amount of data, you'll want to run vacuum analyze.
> >
> Good point! Analyze after bulk inserts, vacuum analyze after
> updates/deletes and inserts. :)
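The advice above can be sketched with the standard PostgreSQL maintenance commands (the table name `orders` is hypothetical):

```sql
-- After a large bulk INSERT: refresh the planner's statistics only.
-- Pure inserts create no dead tuples, so a full vacuum pass is
-- usually unnecessary.
ANALYZE orders;

-- After UPDATEs/DELETEs (or a mixed load): reclaim dead-tuple space
-- *and* refresh statistics in one pass.
VACUUM ANALYZE orders;
```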
Hmmm ... in performance tests of bulk inserts into a table with six indexes, I
found that without VACUUM ANALYZE (as opposed to plain ANALYZE), insertion
slowed down over time, albeit somewhat less than linearly. The table had 6M
rows, about 3GB including index files.
This was 7.4.1 on SuSE Linux: RAID5, Xeon (sigh) 2.8GHz, 4GB RAM, nothing else
running.
The inserts were always done with an existing-record check (LEFT JOIN ... WHERE
joinkey IS NULL).
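A sketch of that existing-record check, with hypothetical table and column names (`staging`, `orders`, `joinkey`, `payload`) -- the anti-join keeps rows that are already present out of the target:

```sql
-- Insert only rows from the staging table whose joinkey is not
-- already in the target: the LEFT JOIN leaves t.joinkey NULL when
-- there is no match, and the WHERE clause keeps exactly those rows.
INSERT INTO orders (joinkey, payload)
SELECT s.joinkey, s.payload
FROM staging s
LEFT JOIN orders t ON t.joinkey = s.joinkey
WHERE t.joinkey IS NULL;
```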
--
"Dreams come true, not free."