From: Tom Lane <tgl(at)sss(dot)pgh(dot)pa(dot)us>
To: s_hawkins(at)mindspring(dot)com (S(dot) Hawkins)
Cc: pgsql-hackers(at)postgresql(dot)org
Subject: Re: restore of large databases failing--any ideas?
Date: 2004-04-09 04:09:49
Message-ID: 20719.1081483789@sss.pgh.pa.us
Lists: pgsql-hackers

s_hawkins(at)mindspring(dot)com (S. Hawkins) writes:
> * We're running Postgres 7.2.3 on a more-or-less stock Red Hat 7.3
> platform.
Both the database and the platform are seriously obsolete :-(

> The particular file I'm wrestling with at the moment is ~2.2 Gig
> unzipped. If you try to restore using pg_restore, the process
> immediately fails with the following:
> pg_restore: [archiver] could not open input file: File too large
It appears that you're working with a pg_restore binary that doesn't
support access to files larger than 2GB. This is mostly an issue of what
the platform's libc can handle, and on many platforms it depends on
build or link options. I no longer recall whether RH 7.3 supported
largefile access at all, let alone what build-time pushups were needed
to make it happen if it could happen.
My recommendation would be to get hold of a current PG version, dump
using the current version's pg_dump, then install and reload into the
current version.
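[Editor's note: independent of the upgrade path recommended above, the PostgreSQL manual's backup chapter documents a workaround for per-file size limits: pipe the dump through split so no single on-disk file ever exceeds the limit, and reassemble with cat at restore time. A hedged sketch; the database names "mydb" and "newdb" are hypothetical.]

```shell
# Dump without ever creating a single file larger than ~1GB;
# split names the pieces mydb.dump.part_aa, _ab, ...
pg_dump mydb | split -b 1000m - mydb.dump.part_

# At restore time, reassemble the stream and feed it to psql.
cat mydb.dump.part_* | psql newdb
```

Because the dump is consumed as a stream from stdin, neither pg_dump nor psql ever has to open a file past the 2GB boundary.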
regards, tom lane