Re: Issue with Running VACUUM on Database with Large Tables

From: Tom Lane <tgl(at)sss(dot)pgh(dot)pa(dot)us>
To: Nagaraj Raj <nagaraj(dot)sf(at)yahoo(dot)com>
Cc: pgsql-bugs(at)lists(dot)postgresql(dot)org
Subject: Re: Issue with Running VACUUM on Database with Large Tables
Date: 2023-12-25 14:53:25
Message-ID: 1843284.1703516005@sss.pgh.pa.us
Lists: pgsql-bugs

Nagaraj Raj <nagaraj(dot)sf(at)yahoo(dot)com> writes:
> While executing a vacuum analyze on our database containing large tables (approximately 200k), I encountered an issue. If a table gets dropped during the vacuum process, the vacuum job fails at that point with an error message stating "OID relation is not found" and exits.

I can't replicate that. I get either

ERROR: relation "foo" does not exist

if you specifically name a nonexistent table, or

WARNING: skipping vacuum of "foo" --- relation no longer exists

if the table existed at the start but doesn't exist by the time
vacuum gets to it. There may be some code path that results in
the error you cite, but you'll need to provide more detail about
how to duplicate it.
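For reference, the WARNING case can be reproduced with two concurrent sessions; a minimal sketch (the table name "foo" is illustrative):

```sql
-- Session 1: create a table, then start a database-wide vacuum.
-- VACUUM collects its list of target relations at the start.
CREATE TABLE foo (id int);
VACUUM;

-- Session 2: drop the table while session 1's VACUUM is still running.
DROP TABLE foo;

-- When session 1's VACUUM reaches foo, it emits
--   WARNING: skipping vacuum of "foo" --- relation no longer exists
-- and continues with the remaining tables instead of erroring out.
```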

regards, tom lane
