From: Mike Charnoky <noky(at)nextbus(dot)com>
To: pgsql-general(at)postgresql(dot)org
Subject: pg_restore and large files
Date: 2004-02-05 14:23:23
Message-ID: 402251DB.4010305@nextbus.com
Lists: pgsql-general
Hello,
I am currently using PostgreSQL v7.3.4 on a RedHat 8.0 system (2.4.23 kernel)
with the ext3 filesystem. I am running into problems when performing a
pg_restore from a dump file that is 2.3 GB in size. The dump, which seemed to
run smoothly, was created with the -Fc (custom format) option. When I perform
the restore, the following error occurs before pg_restore fails:
pg_restore: [custom archiver] error during file seek: Invalid argument
pg_restore: *** aborted because of error
Why is this happening? The error comes from pg_backup_custom.c; it looks like
an fseeko() call is failing (even though fseeko() is supposed to be what makes
large files work here). It is my understanding that ext3 supports file sizes up
to 1 TB. The restore worked fine when the database was smaller. Any ideas?
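
In case it helps narrow this down, here is a minimal standalone test sketch
(not taken from the PostgreSQL sources; it assumes glibc on Linux, and the
dump file name is just a placeholder) that checks whether fseeko() on this
system can seek past the 2 GiB mark, depending on whether it is built with
-D_FILE_OFFSET_BITS=64:

/*
 * seektest.c -- hypothetical standalone test, not part of PostgreSQL.
 * Checks whether this system's fseeko() can position a stream past 2 GiB.
 *
 * With large-file support:    gcc -D_FILE_OFFSET_BITS=64 -o seektest seektest.c
 * Without, for comparison:    gcc -o seektest seektest.c
 */
#define _LARGEFILE_SOURCE       /* make fseeko()/ftello() visible on glibc */

#include <stdio.h>
#include <string.h>
#include <errno.h>
#include <sys/types.h>

int
main(int argc, char **argv)
{
    const char *path = (argc > 1) ? argv[1] : "backup.dump";  /* placeholder name */
    long long   want = 3LL * 1024 * 1024 * 1024;               /* 3 GiB, past the 2 GiB mark */
    FILE       *fp;

    printf("sizeof(off_t) = %d bytes\n", (int) sizeof(off_t));

    if (sizeof(off_t) < 8)
    {
        /* 32-bit off_t: offsets beyond 2 GiB cannot even be represented */
        fprintf(stderr, "built without large-file support\n");
        return 1;
    }

    fp = fopen(path, "rb");
    if (fp == NULL)
    {
        fprintf(stderr, "fopen(%s): %s\n", path, strerror(errno));
        return 1;
    }

    if (fseeko(fp, (off_t) want, SEEK_SET) != 0)
        fprintf(stderr, "fseeko past 2 GiB failed: %s\n", strerror(errno));
    else
        printf("fseeko past 2 GiB succeeded\n");

    fclose(fp);
    return 0;
}

If sizeof(off_t) comes out as 4 bytes, the binary was built without large-file
support, and a seek involving an offset beyond 2 GiB would fail with EINVAL,
which strerror() reports as the same "Invalid argument" text shown above; that
would be one plausible source of the error, though I haven't confirmed it.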
Thanks,
Mike Charnoky