Pg_dump very large database

From: "Nikolay Mihaylov" <pg(at)nmmm(dot)nu>
To: <pgsql-general(at)postgresql(dot)org>
Subject: Pg_dump very large database
Date: 2002-03-01 13:38:44
Message-ID: 005b01c1c126$68490b20$97e309d9@protos
Lists: pgsql-general

Hi all, I had some problems posting this message to the admin list, but the
problem is quite general.

-----Original Message-----
From: Nikolay Mihaylov [mailto:pg(at)nmmm(dot)nu]
Sent: Wednesday, February 27, 2002 10:01 AM
To: 'pgsql-admin(at)postgresql(dot)org'
Subject: FW: Pg_dump very large database

Hi all.
I have a database with some very large tables, about 2 GB or more per table.

When I use the pg_dump tool, it grabs all available memory, and then Linux
crashes (or the kernel kills most of the processes, including pg_dump).

For this reason I made a small PHP script which dumps the data using
'FETCH NEXT IN cursor' (PHP is not ideal for this, but it works perfectly for me).
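For reference, here is a minimal sketch of that cursor-based approach, not the
exact code from the attached tarball; the connection string, table name, and
batch size are placeholders:

<?php
// Sketch only: connection parameters and table name are hypothetical.
$conn = pg_connect("host=localhost dbname=mydb user=postgres")
    or die("connection failed\n");

$table = "big_table";              // placeholder table name
$out   = fopen("$table.dump", "w");

// Cursors only live inside a transaction.
pg_query($conn, "BEGIN");
pg_query($conn, "DECLARE dump_cur CURSOR FOR SELECT * FROM $table");

// Fetch the table in small batches, so memory use stays roughly constant
// instead of the whole result set being pulled into RAM at once.
while (true) {
    $res = pg_query($conn, "FETCH 1000 FROM dump_cur");
    if (pg_num_rows($res) == 0) {
        break;                     // cursor exhausted
    }
    while ($row = pg_fetch_row($res)) {
        // Tab-separated output; a real dump would need proper escaping
        // (or COPY format) to be reliably restorable.
        fwrite($out, implode("\t", $row) . "\n");
    }
    pg_free_result($res);
}

pg_query($conn, "CLOSE dump_cur");
pg_query($conn, "COMMIT");
fclose($out);
pg_close($conn);
?>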
I tried to patch pg_dump itself, but I can't understand most of the code
(I have never worked with PostgreSQL blobs from C).

I'm attaching the files I use in order to share them with all you.

Nikolay.

P.s.
The shell scripts are for calling the PHP scripts.
dump1.sh - creates the list of shell commands that need to be executed to take the backup.
dump.sh - backs up a single table.

P.P.S.

Is there any interest in discussing very large PostgreSQL database programming
and administration, or does such a list already exist?

-----------------------------------------------------------
The Reboots are for hardware upgrades,
Find out more here: http://www.nmmm.nu
Nikolay Mihaylov nmmm(at)nmmm(dot)nu

Attachment Content-Type Size
pgdump-nmmm.tar application/x-tar 10.0 KB
