From: Suhail Bamzena <suhailsalem(at)gmail(dot)com>
To: Ron <ronljohnsonjr(at)gmail(dot)com>
Cc: pgsql-general(at)lists(dot)postgresql(dot)org
Subject: Re: Inherited an 18TB DB & need to backup
Date: 2020-05-16 00:20:00
Message-ID: CAA7EztUo7rNpmrG83Wgwni9t8iABFaV2xuW-96wW+vMEgnsHuQ@mail.gmail.com
Lists: pgeu-general pgsql-general
Thanks Ron. pgbackrest and Barman seem to be good options.
On Sat, 16 May 2020, 02:26 Ron, <ronljohnsonjr(at)gmail(dot)com> wrote:
> For a database that size, I'd install pgbackrest, since it features
> parallel backups and compression. With it, I'd do monthly full backups
> with daily differential backups.
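>
> A minimal sketch of that setup, assuming pgbackrest 2.x (releases of
> that era still supported 9.2), a stanza named "main", and the NAS
> mounted at /mnt/nas; all of those names are illustrative:
>
>   # /etc/pgbackrest/pgbackrest.conf
>   [global]
>   repo1-path=/mnt/nas/pgbackrest    # write backups to the NAS mount
>   repo1-retention-full=2            # keep the last two monthly fulls
>   process-max=8                     # parallel compression/transfer workers
>   compress-level=6                  # gzip level; trades CPU for size
>
>   [main]
>   pg1-path=/var/lib/pgsql/9.2/data  # data directory of the 9.2 cluster
>
>   # /etc/cron.d/pgbackrest: monthly full, daily differential
>   0 2 1 * *    postgres  pgbackrest --stanza=main --type=full backup
>   0 2 2-31 * * postgres  pgbackrest --stanza=main --type=diff backup
>
> pgbackrest also has to own WAL archiving, i.e. archive_command set to
> "pgbackrest --stanza=main archive-push %p" in postgresql.conf.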
>
> (If it's mostly historical data, I'd split the database into multiple
> instances, so that older data rarely needs to be backed up. The
> application, of course, would have to be modified.)
>
> On 5/15/20 8:26 AM, Suhail Bamzena wrote:
>
> Thanks Rory, the machine has the capacity to pull through pg_dumps, but
> as you rightly mentioned, incremental backups mean that we will need to
> work with the WALs. 18 TB is the scary part, and even with compression I
> don't see it being less than 2 TB a day...
>
> On Fri, 15 May 2020, 17:02 Rory Campbell-Lange, <rory(at)campbell-lange(dot)net>
> wrote:
>
>> On 15/05/20, Suhail Bamzena (suhailsalem(at)gmail(dot)com) wrote:
>> > Hello All,
>> > I have very recently inherited an 18 TB DB that is running version 9.2.
>> > Apparently this database has never been backed up, and I have been
>> > tasked with setting up a periodic backup routine (weekly full & daily
>> > incremental) and dumping it into a NAS. What is the best way to go
>> > about this? I did some reading and heard that pgbackrest does a good
>> > job with such huge sizes. Your expert advice is needed.
>>
>> Incremental backups suggest the need to backup WAL archives. See
>> https://www.postgresql.org/docs/9.2/continuous-archiving.html
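>>
>> On 9.2 that amounts to something like the following in postgresql.conf
>> (the command is the cp-based example from those docs; the NAS path is
>> just a placeholder):
>>
>>   wal_level = archive      # the 9.2 name; later releases call it 'replica'
>>   archive_mode = on        # needs a server restart
>>   archive_command = 'test ! -f /mnt/nas/wal/%f && cp %p /mnt/nas/wal/%f'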
>>
>> pgbackrest looks very cool but we haven't used it.
>>
>> A very simple solution could be just to dump the database daily with
>> pg_dump, if you have the space and machine capacity to do it. Depending
>> on what you are storing, you can achieve good compression with this, and
>> it is a great way of having a simple file from which to restore a
>> database.
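>>
>> For example, a nightly cron entry along these lines (database name and
>> target path are placeholders); -Fc writes a compressed custom-format
>> archive that pg_restore can restore selectively:
>>
>>   0 1 * * *  postgres  pg_dump -Fc -f /mnt/nas/dumps/mydb_$(date +\%F).dump mydb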
>>
>> Our ~200GB cluster dumps to under 10GB of pg_dump files, although 18TB
>> is a whole different order of magnitude.
>>
>> Rory
>>
>
> --
> Angular momentum makes the world go 'round.
>