
From: Chitra Creta <chitracreta(at)gmail(dot)com>
To: pgsql-general(at)postgresql(dot)org
Subject: Strategies/Best Practises Handling Large Tables
Date: 2012-10-12 14:44:02
Message-ID: CABkVLeOVhMS6oubnyr4HrcHYtgF2WNScw9Vjv8wqWaOANFu7nQ@mail.gmail.com
Lists: pgsql-general

Hi,

I currently have a table that is growing very quickly, i.e. 7 million
records in 5 days. This table acts as a placeholder for statistics, and
hence records are only ever inserted, never updated or deleted.
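
For concreteness, the sketches further down assume a table shaped
roughly like the following (the name and columns are illustrative
only, not the actual schema):

-- Hypothetical shape of the statistics table.
CREATE TABLE stats (
    recorded_at timestamptz NOT NULL,  -- when the event occurred
    metric      text        NOT NULL,  -- what is being counted
    value       bigint      NOT NULL   -- the measurement itself
);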

Many queries are run on this table for trend analysis. However, these
queries are now starting to take a very long time (hours) to execute due to
the size of the table.
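
To confirm where the time goes, EXPLAIN ANALYZE shows whether the
planner uses an index or falls back to scanning the whole table. A
sketch against the illustrative table above:

-- A typical trend query; a Seq Scan node over the full table in the
-- output would explain the hours-long runtimes, since an index only
-- helps when the WHERE clause is selective enough.
EXPLAIN ANALYZE
SELECT metric, sum(value)
FROM stats
WHERE recorded_at >= now() - interval '30 days'
GROUP BY metric;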

I have put indexes on this table, to no significant benefit. Some of the
other strategies I have thought of (rough sketches of 1, 3, and 4 follow
the list):
1. Purge old data
2. Reindex
3. Partition
4. Create daily, monthly, and yearly summary tables that contain
aggregated data specific to the statistics required
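
A rough sketch of strategy 3, using the table-inheritance approach to
range partitioning supported by the PostgreSQL releases current at the
time (9.x). The table, trigger, and date ranges are hypothetical,
carried over from the sketch above:

-- One child table per month; the CHECK constraint lets
-- constraint_exclusion prune partitions that a query's WHERE
-- clause rules out.
CREATE TABLE stats_2012_10 (
    CHECK (recorded_at >= DATE '2012-10-01'
       AND recorded_at <  DATE '2012-11-01')
) INHERITS (stats);

CREATE INDEX stats_2012_10_recorded_at_idx
    ON stats_2012_10 (recorded_at);

-- Route inserts on the parent to the matching child.
CREATE OR REPLACE FUNCTION stats_insert_trigger() RETURNS trigger AS $$
BEGIN
    IF NEW.recorded_at >= DATE '2012-10-01'
       AND NEW.recorded_at < DATE '2012-11-01' THEN
        INSERT INTO stats_2012_10 VALUES (NEW.*);
    ELSE
        RAISE EXCEPTION 'no partition for timestamp %', NEW.recorded_at;
    END IF;
    RETURN NULL;  -- do not also store the row in the parent
END;
$$ LANGUAGE plpgsql;

CREATE TRIGGER stats_partition_insert
    BEFORE INSERT ON stats
    FOR EACH ROW EXECUTE PROCEDURE stats_insert_trigger();

Partitioning also makes strategy 1 cheap: purging a month of old data
becomes an instant DROP TABLE on the oldest child rather than a slow
bulk DELETE.

And a minimal sketch of strategy 4, a daily summary table refreshed
once a day by a scheduled job (again with hypothetical names):

CREATE TABLE stats_daily (
    day    date   NOT NULL,
    metric text   NOT NULL,
    total  bigint NOT NULL,
    PRIMARY KEY (day, metric)
);

-- Aggregate yesterday's rows; run shortly after midnight.
INSERT INTO stats_daily (day, metric, total)
SELECT recorded_at::date, metric, sum(value)
FROM stats
WHERE recorded_at >= current_date - 1
  AND recorded_at <  current_date
GROUP BY recorded_at::date, metric;

Trend queries can then read the small summary table instead of the raw
one.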

Does anyone know what the best practice is for handling this situation?

I would appreciate knowledge sharing on the pros and cons of the above, or
on any other strategies that I could put in place.

Thanking you in advance.
