From: Chris <dmagick(at)gmail(dot)com>
To: "(dot)ep" <erick(dot)papa(at)gmail(dot)com>
Cc: pgsql-general(at)postgresql(dot)org
Subject: Re: Importing *huge* mysql database into pgsql
Date: 2007-03-07 01:15:58
Message-ID: 45EE124E.2030007@gmail.com
Lists: pgsql-general
.ep wrote:
> Hello,
>
> I would like to convert a mysql database with 5 million records and
> growing, to a pgsql database.
>
> All the stuff I have come across on the net has things like
> "mysqldump" and "psql -f", which sounds like I will be sitting forever
> getting this to work.
If you can convert the database schema, then in mysql dump each
table like this:

select * from table into outfile '/tmp/filename';

(see http://dev.mysql.com/doc/refman/4.1/en/select.html)

and then import it into postgres like this:

\copy table from '/tmp/filename'

(see http://www.postgresql.org/docs/8.2/interactive/sql-copy.html)
That's much faster because it produces a tab-delimited file (the
default format for both commands) which postgres can load in one pass.
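Concretely, the two steps might look like this (the table and file
names below are placeholders, not from the original post):

```sql
-- In mysql (the server writes the file, so the account needs the
-- FILE privilege and the server must be able to write to /tmp):
SELECT * FROM mytable INTO OUTFILE '/tmp/mytable.txt';

-- In psql (\copy reads the file on the client side, so no server
-- filesystem access is required):
\copy mytable from '/tmp/mytable.txt'
```

Both commands default to tab-separated columns with newline-terminated
rows, so the output of the first is normally accepted by the second as
long as the column order matches.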
Using complete inserts for the conversion is horribly slow because, by
default, postgres runs each insert in its own transaction - so either
wrap batches of inserts in a single transaction, or use a copy as
above (copy is best).
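If you do end up with a file of INSERT statements, batching them is a
one-line change. A minimal sketch (table and values are hypothetical):

```sql
-- One transaction around the whole batch instead of one per INSERT:
BEGIN;
INSERT INTO mytable (id, name) VALUES (1, 'first');
INSERT INTO mytable (id, name) VALUES (2, 'second');
-- ... the rest of the dump ...
COMMIT;
```

This avoids a commit (and its disk sync) after every single row, which
is where most of the time goes.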
--
Postgresql & php tutorials
http://www.designmagick.com/