From: Chris Travers <chris(dot)travers(at)gmail(dot)com>
To: Chitra Creta <chitracreta(at)gmail(dot)com>
Cc: pgsql-general(at)postgresql(dot)org
Subject: Re: Strategies/Best Practises Handling Large Tables
Date: 2012-10-17 13:47:24
Message-ID: CAKt_Zfs64hDKsJTTj0YkYLLE7W38KKbp6pDVJ8edj=U-LcBeTA@mail.gmail.com
Lists: pgsql-general
On Fri, Oct 12, 2012 at 7:44 AM, Chitra Creta <chitracreta(at)gmail(dot)com> wrote:
> Hi,
>
> I currently have a table that is growing very quickly - i.e. 7 million
> records in 5 days. This table acts as a placeholder for statistics, and
> hence the records are merely inserted and never updated or deleted.
>
> Many queries are run on this table to obtain trend analysis. However,
> these queries are now starting to take a very long time (hours) to execute
> due to the size of the table.
>
> I have put indexes on this table, to no significant benefit. Some of the
> other strategies I have thought of:
> 1. Purge old data
> 2. Reindex
> 3. Partition
> 4. Creation of daily, monthly, yearly summary tables that contains
> aggregated data specific to the statistics required
>
> Does anyone know what is the best practice to handle this situation?
>
The answer is: well, it depends. Possibly some combination of those.
One approach I like, which may or may not be covered by your #4, is the
idea of summary tables that contain snapshots of the data, allowing you to
roll forward or backward from defined points. This is what I call the log,
aggregate, and snapshot approach. But it really depends on what you are
doing, and there is no one-size-fits-all approach at this volume.
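As a rough sketch of the log, aggregate, and snapshot idea: keep the raw insert-only log, and periodically roll it up into a small snapshot table that trend queries read instead. All table and column names below are illustrative assumptions, not from the original thread:

```sql
-- Hypothetical raw log table: rows are only ever inserted.
CREATE TABLE stats_log (
    id          bigserial   PRIMARY KEY,
    metric      text        NOT NULL,
    value       numeric     NOT NULL,
    recorded_at timestamptz NOT NULL DEFAULT now()
);

-- Daily snapshot: one aggregated row per metric per day.
CREATE TABLE stats_daily_snapshot (
    metric    text    NOT NULL,
    day       date    NOT NULL,
    row_count bigint  NOT NULL,
    total     numeric NOT NULL,
    PRIMARY KEY (metric, day)
);

-- Nightly roll-up of yesterday's raw rows into the snapshot table.
INSERT INTO stats_daily_snapshot (metric, day, row_count, total)
SELECT metric, recorded_at::date, count(*), sum(value)
FROM stats_log
WHERE recorded_at >= date_trunc('day', now() - interval '1 day')
  AND recorded_at <  date_trunc('day', now())
GROUP BY metric, recorded_at::date;
```

Trend queries then scan the small snapshot table, and you can "roll forward" from the last snapshot by aggregating only the raw rows written since it.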
Instead of reindexing, I would suggest also looking into partial indexes.
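For instance, a partial index that covers only the recent rows your trend queries actually touch stays far smaller than a full-table index (column names here are illustrative). Note that a partial-index predicate must use an immutable expression, so a literal cutoff date is used rather than now(), and the index is recreated periodically as the cutoff ages:

```sql
-- Hypothetical: index only recent rows, not the whole history.
CREATE INDEX stats_log_recent_idx
    ON stats_log (metric, recorded_at)
    WHERE recorded_at >= DATE '2012-10-01';
```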
Best Wishes,
Chris Travers