Brian Cox <brian(dot)cox(at)ca(dot)com> writes:
> I changed the logic to update the table in 1M row batches. However,
> after 159M rows, I get:
> ERROR: could not extend relation 1663/16385/19505: wrote only 4096 of
> 8192 bytes at block 7621407

You're out of disk space.

> A df run on this machine shows plenty of space:

Per-user quota restriction, perhaps?

I'm also wondering about temporary files, although I suppose 100G worth
of temp files is a bit much for this query. But you need to watch df
while the query is happening, rather than suppose that an after-the-fact
reading means anything.
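Something along these lines, run in a second terminal for the duration of
the query, will capture the trend (a sketch only -- the DATADIR path and
the sample count/interval defaults here are placeholders; point DATADIR at
the filesystem holding your data directory, and raise SAMPLES and INTERVAL
to cover the whole run):

```shell
# Sample free space on one filesystem at a fixed interval, printing a
# timestamped line each time, so you can see usage *during* the query.
# DATADIR defaults to / here purely as a placeholder.
DATADIR=${DATADIR:-/}
INTERVAL=${INTERVAL:-1}
SAMPLES=${SAMPLES:-5}

i=0
while [ "$i" -lt "$SAMPLES" ]; do
    # -P forces POSIX output so each filesystem stays on one line
    printf '%s ' "$(date '+%H:%M:%S')"
    df -kP "$DATADIR" | tail -1
    i=$((i + 1))
    sleep "$INTERVAL"
done
```

If free space dips toward zero and then recovers after the error, that
points at temp files or WAL rather than the table itself.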

			regards, tom lane