From: Andreas Joseph Krogh <andreas(at)visena(dot)com>
To: DrakoRod <drakoflames(at)hotmail(dot)com>
Cc: pgsql-general(at)postgresql(dot)org
Subject: Sv: DELETE Query Hang
Date: 2019-11-12 23:20:36
Message-ID: VisenaEmail.56.c3b346f6a24e6ccf.16e61e9df08@tc7-visena
Lists: pgsql-general
On Tuesday, 12 November 2019 at 23:47:18, DrakoRod <drakoflames(at)hotmail(dot)com> wrote:
Hi folks!
I have a question. In a database there is a table with many files (bytea) stored in it
(I support this database, I didn't design it), and we need to delete many rows
(approx. 38000). But when I execute this query:
BEGIN;
ALTER TABLE my_file_table DISABLE TRIGGER ALL;
DELETE FROM my_file_table WHERE id_table <> 230;
the query hangs... after 50 minutes it still has not finished.
Any suggestions?
Check for locks and blocking statements:
https://wiki.postgresql.org/wiki/Lock_Monitoring
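If the DELETE is being blocked by another session, something along these lines will show it (a minimal sketch, assuming PostgreSQL 9.6 or newer for pg_blocking_pids; the wiki page above has variants for older versions):

-- Sessions waiting on a lock, and the sessions holding the locks that block them.
SELECT waiting.pid    AS blocked_pid,
       waiting.query  AS blocked_query,
       blocker.pid    AS blocking_pid,
       blocker.query  AS blocking_query
FROM   pg_stat_activity AS waiting
JOIN   pg_stat_activity AS blocker
       ON blocker.pid = ANY (pg_blocking_pids(waiting.pid))
WHERE  waiting.wait_event_type = 'Lock';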
You can delete in chunks like this:
do $_$
declare num_rows bigint;
begin
  loop
    delete from YourTable where id in (select id from YourTable where id < 500 limit 100);
    get diagnostics num_rows = row_count;
    raise notice 'deleted % rows', num_rows;
    exit when num_rows = 0;
  end loop;
end; $_$;
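Adapted to the table in your post it might look like this (just a sketch: it assumes my_file_table has a single-column primary key named id, which your message doesn't show; adjust the key column and batch size as needed):

do $_$
declare num_rows bigint;
begin
  loop
    -- Delete at most 1000 matching rows per iteration, keyed by the (assumed) primary key "id".
    delete from my_file_table
      where id in (select id from my_file_table where id_table <> 230 limit 1000);
    get diagnostics num_rows = row_count;
    raise notice 'deleted % rows', num_rows;
    exit when num_rows = 0;
  end loop;
end; $_$;

Note that the whole DO block still runs in one transaction; if you want each batch committed separately, run the inner DELETE repeatedly from a client script instead.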
--
Andreas Joseph Krogh