From: Steve Crawford <scrawford(at)pinpointresearch(dot)com>
To: Radcon Entec <radconentec(at)yahoo(dot)com>
Cc: pgsql-general(at)postgresql(dot)org
Subject: Re: Table has 22 million records, but backup doesn't see them
Date: 2009-04-08 15:25:20
Message-ID: 49DCC1E0.6090609@pinpointresearch.com
Lists: pgsql-general
Radcon Entec wrote:
> Greetings!
>
> I'm running PostgreSQL 8.1 under Windows XP, looking at a database
> hosted on a machine running PostgreSQL under Windows Server 2003.
>
> The database has a table with three simple columns and 22 million
> rows. I am trying to back up that table by itself. However, pg_dump
> finishes almost instantly, obviously not backing up any data from the
> table. I've tried it from the DOS command line with and without the
> -a (data only) option, and from inside PGAdmin. Can anyone suggest
> what might cause this behavior?
>
What is the exact command, and what is its output? (I'll be surprised if
there is no output at all to either stdout or stderr.) Does pg_dumpall
run fine from the same machine? How about psql? Are you sure you are
hitting a base table and not a view? Do the server logs show anything
interesting?
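
For comparison, a single-table dump and a row-count check from the
command line might look something like this (the host, database, and
table names below are placeholders, not your actual names):

    pg_dump -h dbserver -U postgres -t mytable -f mytable.sql mydb
    psql -h dbserver -U postgres -c "SELECT count(*) FROM mytable" mydb

If the count reports the 22 million rows but mytable.sql is still nearly
empty, that would point at pg_dump resolving the -t argument to a
different relation (for example a view or a table in another schema)
than the one psql is counting.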
Cheers,
Steve