Re: Best way to delete big amount of records from big table

From: Ekaterina Amez <ekaterina(dot)amez(at)zunibal(dot)com>
To: Michael Lewis <mlewis(at)entrata(dot)com>
Cc: postgres performance list <pgsql-performance(at)postgresql(dot)org>
Subject: Re: Best way to delete big amount of records from big table
Date: 2020-03-27 14:55:27
Message-ID: CAFijohh6BvLirtr_7RRsR9n0EC9ytFBFVa0OwGeeNsN=vv716g@mail.gmail.com
Lists: pgsql-performance

Sorry, I sent my response only to you, I'm sending it again to the group in
a minute...

On Fri, Mar 27, 2020 at 3:41 PM, Michael Lewis (<mlewis(at)entrata(dot)com>)
wrote:

> If you can afford the time, I am not sure the reason for the question.
> Just run it and be done with it, yes?
>
> A couple of thoughts-
> 1) That is a very big transaction if you are doing all the cleanup in a
> single function call. Will this be a production system that is still online
> during the archiving? Having a plpgsql function that encapsulates the work
> seems fine, but I would limit the work to a month at a time or so and call
> the function repeatedly: find the earliest month for which records still
> exist, delete everything matching it, return. Rinse, repeat.
> 2) If you are deleting/moving most of the table (91 of 150 million rows),
> consider moving only the records you are keeping to a new table, renaming
> the old table, and renaming the new table back to the original name. Then
> you can do what you want to shift the data in the old table and delete it.
>
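A minimal sketch of the month-at-a-time approach in point 1, assuming a hypothetical `big_table` with a `created_at` timestamp column (names are illustrative, not from the thread):

```sql
-- Hypothetical schema: big_table(created_at timestamptz, ...).
-- Deletes one month per call, oldest first; returns the number of rows deleted.
CREATE OR REPLACE FUNCTION archive_oldest_month()
RETURNS bigint
LANGUAGE plpgsql
AS $$
DECLARE
    min_month timestamptz;
    deleted   bigint;
BEGIN
    -- Find the earliest month for which records still exist.
    SELECT date_trunc('month', min(created_at))
      INTO min_month
      FROM big_table;

    IF min_month IS NULL THEN
        RETURN 0;  -- table is empty, nothing left to do
    END IF;

    -- Delete everything matching that month.
    DELETE FROM big_table
     WHERE created_at >= min_month
       AND created_at <  min_month + interval '1 month';

    GET DIAGNOSTICS deleted = ROW_COUNT;
    RETURN deleted;
END;
$$;
```

Called repeatedly from the client (`SELECT archive_oldest_month();`), each invocation is its own transaction, so locks and WAL volume stay bounded per batch instead of one 91-million-row transaction.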
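And a sketch of the copy-and-rename approach in point 2; the cutoff predicate here is a placeholder for whatever condition selects the ~59 million rows being kept:

```sql
-- Keep only the wanted rows in a fresh table, then swap names.
BEGIN;
CREATE TABLE big_table_new (LIKE big_table INCLUDING ALL);

INSERT INTO big_table_new
SELECT * FROM big_table
 WHERE created_at >= '2019-01-01';  -- hypothetical "rows to keep" condition

ALTER TABLE big_table     RENAME TO big_table_old;
ALTER TABLE big_table_new RENAME TO big_table;
COMMIT;

-- Archive or DROP big_table_old at leisure; note that views, foreign keys,
-- and sequence ownership referencing the old table may need re-pointing.
```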
