Re: Strategies/Best Practises Handling Large Tables

From: Igor Romanchenko <igor(dot)a(dot)romanchenko(at)gmail(dot)com>
To: Chitra Creta <chitracreta(at)gmail(dot)com>
Cc: Chris Travers <chris(dot)travers(at)gmail(dot)com>, pgsql-general(at)postgresql(dot)org
Subject: Re: Strategies/Best Practises Handling Large Tables
Date: 2012-11-15 14:40:28
Message-ID: CAP95Gq=ypN-mapDnYBnXeGfJUMgXtyonfLQ8j=yC3r_tGNiPDQ@mail.gmail.com
Lists: pgsql-general

On Thu, Nov 15, 2012 at 1:34 PM, Chitra Creta <chitracreta(at)gmail(dot)com> wrote:

> Thanks for your example Chris. I will look into it as a long-term solution.
>
> Partitioning tables as a strategy worked very well indeed. This will be my
> short/medium term solution.
>
> Another strategy that I would like to evaluate as a short/medium term
> solution is archiving old records in a table before purging them.
>
> I am aware that Oracle has a tool that allows records to be exported into
> a file / archive table before purging them. They also provide a tool to
> import these records.
>
> Does PostgreSQL have similar tools to export to a file and re-import?
>
> If PostgreSQL does not have a tool to do this, does anyone have any ideas
> on what file format (e.g. text file containing a table of headers being
> column names and rows being records) would be ideal for easy re-importing
> into a PostgreSQL table?
>
> Thank you for your ideas.
>

PostgreSQL has COPY TO to export records to a file, and COPY FROM to re-import them
(http://wiki.postgresql.org/wiki/COPY).
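A minimal sketch of the archive-then-purge workflow using COPY, assuming a hypothetical "orders" table with a "created" timestamp column (table and path names are illustrative, not from the thread):

```sql
-- Export rows older than a cutoff date to a CSV file with a header row
-- (headers are the column names, rows are the records):
COPY (SELECT * FROM orders WHERE created < '2011-01-01')
    TO '/var/backups/orders_2010.csv' WITH (FORMAT csv, HEADER);

-- Purge the archived rows:
DELETE FROM orders WHERE created < '2011-01-01';

-- Later, re-import the archived rows from the same file:
COPY orders FROM '/var/backups/orders_2010.csv' WITH (FORMAT csv, HEADER);
```

Note that server-side COPY reads and writes files on the database server and requires superuser privileges; for a client-side file, the psql \copy meta-command provides the same functionality from the client machine.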
