Re: Backup Large Tables

From: "Charles Ambrose" <jamjam360(at)gmail(dot)com>
To: "Michael Nolan" <htfoot(at)gmail(dot)com>
Cc: pgsql-general(at)postgresql(dot)org
Subject: Re: Backup Large Tables
Date: 2006-09-22 04:19:52
Message-ID: 61ca079e0609212119u5d44b14dh39bf28f76374942d@mail.gmail.com
Lists: pgsql-general

Hi!

I encounter errors when dumping the database with pg_dump. I think the
database is corrupt: pg_dump was looking for triggers and stored procedures
that are no longer in the database. This is also the reason I opted to write
a program to dump the data myself.
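As a point of reference, a table-by-table dump is usually much faster when it
streams the rows with COPY instead of fetching them one at a time through the
client. Below is a minimal sketch of that kind of dump program in Python with
psycopg2; the connection string, table names, and file names are made-up
placeholders, not anything from the setup described here.

    # Sketch: dump each table to a tab-delimited text file using COPY.
    # The database, user, and table names below are hypothetical.
    import psycopg2

    TABLES = ["big_table_1", "big_table_2"]   # placeholder table names

    conn = psycopg2.connect("dbname=mydb user=postgres")
    cur = conn.cursor()
    for table in TABLES:
        with open(table + ".txt", "w") as f:
            # COPY ... TO STDOUT streams rows server-side, which avoids the
            # per-row overhead of SELECTing everything through the client.
            cur.copy_expert("COPY %s TO STDOUT" % table, f)
    cur.close()
    conn.close()

A dump produced this way can be reloaded with COPY ... FROM, but note that it
covers only the data, not the schema, triggers, or stored procedures.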

On 9/22/06, Michael Nolan <htfoot(at)gmail(dot)com> wrote:
>
> I have a table with over 6 million rows that I dump every night. It takes
> less than 2 minutes to create a file that is around 650 MB.
>
> Are you maybe dumping this file in 'insert' mode?
> --
> Mike Nolan
>
> On 9/21/06, Charles Ambrose <jamjam360(at)gmail(dot)com> wrote:
> >
> > Hi!
> >
> > I have fairly large database tables (an average of 3 million to 4 million
> > records each). Using the pg_dump utility takes forever to dump the
> > database tables. As an alternative, I wrote a program that gets all the
> > data from each table and puts it into a text file. This alternative was
> > also unsuccessful in dumping the database.
> >
> >
> >
>
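
On the 'insert' mode question quoted above: pg_dump writes table data as COPY
blocks by default, which is much faster to dump and restore than the --inserts
mode that emits one INSERT statement per row. A rough sketch of driving both
variants from a script, with placeholder database and table names:

    # Sketch: compare pg_dump's default COPY output with --inserts output.
    # "mydb" and "big_table_1" are hypothetical names used for illustration.
    import subprocess

    # Default mode: table data is emitted as COPY blocks.
    subprocess.check_call(
        ["pg_dump", "-t", "big_table_1", "-f", "big_table_1_copy.sql", "mydb"])

    # --inserts mode: one INSERT per row; noticeably slower, mainly useful
    # when the dump has to be loaded into something other than PostgreSQL.
    subprocess.check_call(
        ["pg_dump", "--inserts", "-t", "big_table_1",
         "-f", "big_table_1_inserts.sql", "mydb"])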
