From: | Keith <keith(at)keithf4(dot)com> |
---|---|
To: | Rohit Rajput <rht(dot)rajput(at)yahoo(dot)com> |
Cc: | "pgsql-admin(at)lists(dot)postgresql(dot)org" <pgsql-admin(at)lists(dot)postgresql(dot)org> |
Subject: | Re: Issue with delete |
Date: | 2021-07-12 20:30:59 |
Message-ID: | CAHw75vum=nO9FrJSOPBa_gg-Qi-F+becksyteHEzNhaBZ+X4Bw@mail.gmail.com |
Lists: | pgsql-admin |
On Thu, Jun 24, 2021 at 5:08 AM Rohit Rajput <rht(dot)rajput(at)yahoo(dot)com> wrote:
> Good day everyone,
>
> I have a few tables where I need to delete over 50 million rows each. Since
> I don't want to delete them all in one shot and block the table for an
> extended time, I am trying to use LIMIT with DELETE, but it gives me an
> error every time. May I please get some assistance here? Thanks
>
> Here is what I am running:
>
> delete from mytable where id in (select id from mytable where
> condition=true limit 1000);
>
> When I run this, I get the following error:
>
> Error: current transaction is aborted, commands ignored until end of
> transaction block.
> SQL state: 25P02.
>
>
> I am running it in RDS.
>
>
> TIA, Best,
> Rohit
>
As others have said, there is no LIMIT clause for the UPDATE/DELETE
statements in PostgreSQL. However, you can work around that using a common
table expression (CTE). We have a blog post on doing exactly this up at
CrunchyData. (Note that the 25P02 error itself means an earlier statement
in that transaction had already failed; you will need to ROLLBACK before
running anything else in that session.)
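For reference, a minimal sketch of the CTE approach, reusing the table and
column names from your original query (run it in a loop, committing between
batches, until it reports zero rows deleted):

```sql
-- DELETE has no LIMIT clause, so select a bounded batch of ids in a CTE
-- and join against it. Repeat until DELETE reports 0 rows.
WITH batch AS (
    SELECT id
    FROM mytable
    WHERE condition = true
    LIMIT 1000
)
DELETE FROM mytable
USING batch
WHERE mytable.id = batch.id;
```

Each run only locks the 1000 rows in the batch, so the table stays usable
while the loop works through the 50 million rows. An index on the columns in
the WHERE clause keeps each batch's SELECT cheap.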
Keith