From: Alvaro Herrera <alvherre(at)alvh(dot)no-ip(dot)org>
To: Logan Bowers <logan(at)zillow(dot)com>
Cc: pgsql-general(at)postgresql(dot)org
Subject: Re: Clog/Transaction problems
Date: 2005-08-05 21:53:24
Message-ID: 20050805215324.GB14653@alvh.no-ip.org
Lists: pgsql-general
On Fri, Aug 05, 2005 at 04:52:29PM -0400, Logan Bowers wrote:
> Hello,
>
> I have a really weird problem; queries against a very large table are
> failing with a strange error. Case in point:
>
> dqfull=# vacuum freeze mytable;
> ERROR: could not access status of transaction 538989714
> DETAIL: could not open file "/srv/db/postgresql/pg_clog/0202": No such
> file or directory
Looks like a flipped bit. 538989714 is 100000001000000101010010010010
in binary.
Have you checked your RAM?
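For illustration, clearing that suspect high bit (bit 29) turns it into
a far more plausible xid for a recent ~40M-row import. A quick check in
the shell, just a sketch:

    $ echo $(( 538989714 & ~(1 << 29) ))
    2118802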
> WTF? The only activity this table has seen is a massive data import of
> around ~40M rows. Is there a way to fix clog info and make it think all
> transactions on it have committed?
You could fill the pg_clog/0202 file with 0x55, which means "all
transactions committed". You'd also have to change the nextXid counter
to beyond what the 0202 file records (pg_resetxlog can do that for you),
in order not to have problems in the future.
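For the record, a sketch of what that could look like, assuming the
default 8kB block size (a clog segment is then 32 pages x 8192 bytes =
256kB, holding 4 transaction statuses per byte, so segment 0x202 covers
xids 538968064 through 540016639). Stop the server and back up the file
before trying this:

    # fill segment 0202 with 0x55 (octal 125), i.e. every 2-bit
    # transaction status slot set to "committed"
    dd if=/dev/zero bs=8192 count=32 | tr '\0' '\125' \
        > /srv/db/postgresql/pg_clog/0202

    # move nextXid past the last xid that segment covers
    pg_resetxlog -x 540016640 /srv/db/postgresql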
However, if bad RAM is the issue, you'll run into other problems in the
future unless it is fixed.
--
Alvaro Herrera (<alvherre[a]alvh.no-ip.org>)
"Pensar que el espectro que vemos es ilusorio no lo despoja de espanto,
sólo le suma el nuevo terror de la locura" (Perelandra, CSLewis)