From: Jeremy Buchmann <jeremy(at)wellsgaming(dot)com>
To: Ted Rolle <ted(at)tvlconn(dot)com>, "'pgsql-admin'" <pgsql-admin(at)postgresql(dot)org>
Subject: Re: Fast load
Date: 2001-08-24 22:50:51
Message-ID: B7AC285A.37B7%jeremy@wellsgaming.com
Lists: pgsql-admin
> We have 73 databases, two dozen with hundreds of thousands to millions of
> records, with lengths in the 500-byte range. I'm planning to convert them
> from Btrieve to PostgreSQL.
>
> Of course, I want the highest reasonable speed so that the conversion can be
> completed - say - in a week-end.
>
> My initial take is to create a tab-delimited file and use this as input to
> the \copy command in psql.
>
> Another option might be to feed them directly to the back end.
>
> I'm a C programmer, so writing the necessary programs for the conversion and
> load is not a problem.
>
> Any pointers?
Use \copy. It's very fast at importing tab-delimited text files. Just make
sure that what you export from the old database is very "clean", meaning no
quotes around text fields, no tabs within a field, that sort of thing. 73
is a lot of databases, but it will probably take you more time to actually
type the \copy commands than it will for postgres to copy the data.
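A minimal sketch of that "clean export" step, in Python (the helper names here are hypothetical, not from the original thread). PostgreSQL's COPY text format delimits fields with tabs, writes NULL as \N, and expects backslash escapes for any backslash, tab, newline, or carriage return embedded in a field, so escaping (rather than just avoiding) those characters keeps the file safe for \copy:

```python
# Sketch: format rows as PostgreSQL COPY-format text (tab-delimited),
# escaping the characters that would otherwise break the load.

def copy_escape(value):
    """Escape one field for COPY text format; None becomes \\N (NULL)."""
    if value is None:
        return r"\N"
    return (str(value)
            .replace("\\", "\\\\")   # escape backslashes first
            .replace("\t", "\\t")    # embedded tab would split the field
            .replace("\n", "\\n")    # embedded newline would split the row
            .replace("\r", "\\r"))

def copy_line(row):
    """Join a row of fields into one tab-delimited COPY text line."""
    return "\t".join(copy_escape(v) for v in row) + "\n"
```

A file built from such lines can then be loaded with something like `\copy mytable from 'mytable.txt'` in psql (table and file names are placeholders).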
-- Jeremy [jeremy(at)wellsgaming(dot)com]
Next message: Ted Rolle, 2001-08-24 23:00:15, "RE: Fast load"
Previous message: Ted Rolle, 2001-08-24 21:49:32, "Fast load"