From: Grzegorz Jaśkiewicz <gryzman(at)gmail(dot)com>
To: zxo102 ouyang <zxo102(at)gmail(dot)com>
Cc: pgsql-general(at)postgresql(dot)org
Subject: Re: Very slow searching in a table with more than 10 millions recovered records from a backup file...
Date: 2009-06-12 09:03:08
Message-ID: 2f4958ff0906120203h29975130j5a581da0bda5ae96@mail.gmail.com
Lists: pgsql-general
On Fri, Jun 12, 2009 at 9:56 AM, zxo102 ouyang <zxo102(at)gmail(dot)com> wrote:
> Hi there,
> I have an application with a PostgreSQL database that has a big table
> (> 10 million records) on Windows 2003. Sometimes I need to install a new
> version of the application. Here is what I do: 1. back up the big table
> via pgAdmin III, 2. stop PostgreSQL in the old version of the application,
> 3. install the new version of the application (PostgreSQL is included and
> all tables stay the same as before), and 4. restore the data (> 10 million
> records) into the table from the backup file.
> After I restart the application, searching the table becomes very, very
> slow (much slower than searching in the old version). I don't know what
> is wrong. Does PostgreSQL need time to "reindex" those 10 million records
> before searching is fast again?
This is because you skipped VACUUM ANALYZE in those steps; it should be
run right after the restore. A freshly restored table has no planner
statistics, so the query planner cannot choose good plans (e.g. it may
fall back to sequential scans) until ANALYZE has been run on it.
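As a minimal sketch (the table and database names here are made up, substitute your own), you could run from psql or pgAdmin:

```
-- Update planner statistics and reclaim dead space for the restored table.
-- Assumed names: database "mydb", table "big_table".
VACUUM ANALYZE big_table;

-- Or, from the Windows command prompt, for the whole database:
--   psql -d mydb -c "VACUUM ANALYZE;"
-- (or use the bundled vacuumdb tool: vacuumdb --analyze mydb)
```

Running plain ANALYZE alone is enough to fix the statistics; VACUUM ANALYZE additionally updates visibility information, which is cheap to do right after a bulk load.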
--
GJ
Next message: Yaroslav Tykhiy | 2009-06-12 09:53:09 | Re: How to store text files in the postgresql?
Previous message: zxo102 ouyang | 2009-06-12 08:56:15 | Very slow searching in a table with more than 10 millions recovered records from a backup file...