From: Ron <ronljohnsonjr(at)gmail(dot)com>
To: pgsql-general(at)lists(dot)postgresql(dot)org
Subject: Re: Inherited an 18TB DB & need to backup
Date: 2020-05-15 22:25:55
Message-ID: e1e0eb54-a5f1-1986-f55f-8740d88f489f@gmail.com
Lists: pgeu-general pgsql-general
For a database that size, I'd install pgbackrest, since it features parallel
backups and compression. With it, I'd do monthly full backups with daily
differential backups.
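A minimal sketch of that monthly-full / daily-differential scheme with pgbackrest might look like the following. The stanza name `main`, the NAS mount point, the data directory, and the retention and parallelism values are illustrative assumptions, not values from this thread:

```shell
# /etc/pgbackrest.conf -- illustrative settings, adjust for your environment:
# [global]
# repo1-path=/mnt/nas/pgbackrest      # backup repository on the NAS
# repo1-retention-full=2              # keep the two most recent full backups
# process-max=8                       # parallel backup/restore processes
#
# [main]
# pg1-path=/var/lib/pgsql/9.2/data    # cluster data directory

# Create the stanza once, then schedule the backups (e.g. from cron):
pgbackrest --stanza=main stanza-create
pgbackrest --stanza=main --type=full backup    # monthly full
pgbackrest --stanza=main --type=diff backup    # daily differential
```

A differential backs up everything changed since the last full, so a restore only ever needs the latest full plus the latest differential.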
(If it's mostly historical data, I'd split the database into multiple
instances, so that older data rarely needs to be backed up. The application,
of course, would have to be modified.)
On 5/15/20 8:26 AM, Suhail Bamzena wrote:
> Thanks Rory, the machine has the capacity to pull through pg_dumps, but
> as you rightly mentioned, incremental backups mean that we will need to
> work with the WALs. 18TB is the scary part, and even with compression
> I don't see it being less than 2TB a day...
>
> On Fri, 15 May 2020, 17:02 Rory Campbell-Lange, <rory(at)campbell-lange(dot)net
> <mailto:rory(at)campbell-lange(dot)net>> wrote:
>
> On 15/05/20, Suhail Bamzena (suhailsalem(at)gmail(dot)com
> <mailto:suhailsalem(at)gmail(dot)com>) wrote:
> > Hello All,
> > I have very recently inherited an 18 TB DB that is running version 9.2.
> > Apparently this database has never been backed up, and I have been
> > tasked to set up a periodic backup routine (weekly full & daily
> > incremental) and dump it into a NAS. What is the best way to go about
> > this? Did some reading and heard that pgbackrest does a good job with
> > such huge sizes. Your expert advice is needed.
>
> Incremental backups suggest the need to backup WAL archives. See
> https://www.postgresql.org/docs/9.2/continuous-archiving.html
>
> pgbackrest looks very cool but we haven't used it.
>
> A very simple solution could be just to dump the database daily with
> pg_dump, if you have the space and machine capacity to do it. Depending
> on what you are storing, you can achieve good compression with this, and
> it is a great way of having a simple file from which to restore a
> database.
>
> Our ~200GB cluster compresses to under 10GB of pg_dump files, although
> 18TB is a whole different order of size.
>
> Rory
>
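For reference, the pg_dump route Rory describes could be scripted roughly like this. The database name `mydb` and the NAS path are placeholder assumptions:

```shell
# Daily dump in custom format (-Fc): compressed by default, and
# restorable selectively (per table/schema) with pg_restore.
pg_dump -Fc --file=/mnt/nas/dumps/mydb_$(date +%F).dump mydb

# Parallel dumps (-j) need the directory format and PostgreSQL 9.3+;
# a 9.2 server's pg_dump runs single-threaded:
# pg_dump -Fd -j 8 --file=/mnt/nas/dumps/mydb_$(date +%F) mydb
```

Note that a dump is a logical snapshot only; unlike WAL archiving it gives no point-in-time recovery between dumps.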
--
Angular momentum makes the world go 'round.