From: Rory Campbell-Lange <rory(at)campbell-lange(dot)net>
To: Suhail Bamzena <suhailsalem(at)gmail(dot)com>
Cc: pgsql-general(at)lists(dot)postgresql(dot)org, pgeu-general(at)lists(dot)postgresql(dot)org
Subject: Re: Inherited an 18TB DB & need to backup
Date: 2020-05-15 13:02:46
Message-ID: 20200515130246.GB25570@campbell-lange.net
Lists: pgeu-general pgsql-general
On 15/05/20, Suhail Bamzena (suhailsalem(at)gmail(dot)com) wrote:
> Hello All,
> I have very recently inherited an 18 TB DB that is running version 9.2.
> Apparently this database has never been backed up and I have been tasked to
> set up a periodic backup routine (weekly full & daily incremental) and dump
> it onto a NAS. What is the best way to go about this? Did some reading and
> heard that pgbackrest does a good job with such huge sizes. Your expert
> advice is needed.
Incremental backups suggest the need to back up WAL archives. See
https://www.postgresql.org/docs/9.2/continuous-archiving.html
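On 9.2 that amounts to pointing archive_command at the NAS and taking a
periodic base backup to apply the WAL against. Very roughly, something
like the below (the /mnt/nas paths are placeholders only, untested here):

    # postgresql.conf (9.2; changing archive_mode needs a restart)
    wal_level = archive
    archive_mode = on
    archive_command = 'test ! -f /mnt/nas/wal/%f && cp %p /mnt/nas/wal/%f'

    # weekly base backup, run as the postgres user
    pg_basebackup -D /mnt/nas/base/$(date +%Y%m%d) -Ft -z -X fetch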
pgbackrest looks very cool but we haven't used it.
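From its documentation the workflow is roughly a one-off stanza-create
followed by scheduled full and incremental backups, along these lines
("main" is just an example stanza name, and do check that the pgbackrest
version you install still supports 9.2):

    pgbackrest --stanza=main stanza-create
    pgbackrest --stanza=main --type=full backup
    pgbackrest --stanza=main --type=incr backup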
A very simple solution could be just to dump the database daily with
pg_dump, if you have the space and machine capacity to do it. Depending
on what you are storing, you can achieve good compression with this, and
it is a great way of having a simple file from which to restore a
database.
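As a sketch, with the database name and NAS path as placeholders (the
custom format is compressed by default and can be restored selectively
with pg_restore):

    # nightly compressed dump in custom format
    pg_dump -Fc -f /mnt/nas/dumps/mydb_$(date +%F).dump mydb

    # restore into an existing (empty) database, 4 parallel jobs
    pg_restore -j 4 -d mydb /mnt/nas/dumps/mydb_2020-05-15.dump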
Our ~200GB cluster compresses down to under 10GB of pg_dump files,
although 18TB is a whole different order of magnitude.
Rory