From: "Nikolay Mihaylov" <pg(at)nmmm(dot)nu>
To: <pgsql-admin(at)postgresql(dot)org>
Subject: FW: Pg_dump very large database
Date: 2002-02-27 08:00:39
Message-ID: 001501c1bf64$d87f9800$97e309d9@protos
Lists: pgsql-admin
Hi all,
I have a database with very large tables - about 2+ GB per table.
When I use the pg_dump tool, it grabs all the available memory and then
Linux crashes (or the kernel kills most of the processes, including
pg_dump).
For this reason I wrote a small PHP script that dumps the data using
"FETCH NEXT IN cursor" (PHP is lame for this, but it is perfect for me).
I tried to patch pg_dump, but I can't understand most of the code (I
have never worked with PostgreSQL blobs from C).
I'm attaching the files I use in order to share them with all of you.
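The batched-cursor idea can be sketched roughly like this. This is an illustration in Python, not the attached PHP script; the table name, cursor name, and batch size are made up, and a real client would keep fetching until a FETCH returns no rows instead of precomputing the count.

```python
# Sketch of a cursor-based dump: instead of SELECT-ing the whole table
# (which makes the client hold every row in memory at once), declare a
# cursor and FETCH rows in fixed-size batches so memory use stays bounded.
# All identifiers here are hypothetical examples.

def cursor_dump_statements(table, batch_size=1000, total_rows=0):
    """Yield the SQL statements a batched cursor dump would execute.

    total_rows is only used here to decide how many FETCHes to emit;
    a real client would loop until a FETCH comes back empty.
    """
    yield "BEGIN;"
    yield f"DECLARE dump_cur CURSOR FOR SELECT * FROM {table};"
    # Number of batches, rounded up.
    fetches = (total_rows + batch_size - 1) // batch_size
    for _ in range(fetches):
        yield f"FETCH {batch_size} FROM dump_cur;"
    yield "CLOSE dump_cur;"
    yield "COMMIT;"

stmts = list(cursor_dump_statements("big_table", batch_size=500, total_rows=1200))
for s in stmts:
    print(s)
```

The point is that each FETCH transfers only `batch_size` rows, so the client's memory footprint is bounded no matter how large the table is.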
Nikolay.
P.S.
The shell scripts are for calling the PHP scripts:
dump1.sh - creates a list of shell commands that need to be executed in
order to make the backup.
dump.sh - backs up a single table.
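As a rough illustration of how dump1.sh might drive dump.sh - the table names and output files below are made-up examples, the real scripts are in the attached tarball:

```shell
#!/bin/sh
# Hypothetical sketch: emit one dump.sh invocation per table, so the
# resulting list of commands can be reviewed and then executed to run
# the backup one table at a time.
for t in accounts orders big_table
do
    echo "./dump.sh $t > $t.out"
done
```

Generating the command list first, rather than running the dumps directly, lets you inspect or reorder the per-table jobs before committing to a long backup run.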
P.S.
Is there any interest in discussing very large PostgreSQL database
programming and administration, or does such a list already exist?
-----------------------------------------------------------
The Reboots are for hardware upgrades,
Found more here: http://www.nmmm.nu
Nikolay Mihaylov nmmm(at)nmmm(dot)nu
Attachment: pgdump-nmmm.tar (application/x-tar, 10.0 KB)