From: Richard Huxton <dev(at)archonet(dot)com>
To: "(dot)ep" <erick(dot)papa(at)gmail(dot)com>
Cc: pgsql-general(at)postgresql(dot)org
Subject: Re: Importing *huge* mysql database into pgsql
Date: 2007-03-06 14:54:02
Message-ID: 45ED808A.1060007@archonet.com
Lists: pgsql-general
.ep wrote:
> Hello,
>
> I would like to convert a mysql database with 5 million records and
> growing, to a pgsql database.
And where's the huge database?
> All the stuff I have come across on the net has things like
> "mysqldump" and "psql -f", which sounds like I will be sitting forever
> getting this to work.
Well, there's not much of an alternative to exporting from one system
and importing to another. If you do find a better way, patent it!
This is probably a sensible place to start for converting schemas:
http://pgfoundry.org/projects/mysql2pgsql
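For reference, the whole round trip is only a handful of commands. This
is a sketch rather than a recipe - "mydb" is a placeholder, and check
how your copy of the converter wants to be invoked:

# Dump from MySQL. --compatible=postgresql smooths over some syntax,
# and one INSERT per row makes bad rows easier to find and fix.
mysqldump --compatible=postgresql --skip-extended-insert mydb > mydb.mysql.sql

# Convert schema and data (exact invocation depends on your copy).
mysql2pgsql.perl mydb.mysql.sql mydb.pg.sql

# Load into a fresh PostgreSQL database, stopping at the first error.
createdb mydb
psql -v ON_ERROR_STOP=1 -f mydb.pg.sql mydb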
Then, you'll face two problems:
1. Invalid data in your mysql dump (e.g. dates like 0000-00-00; a quick
fix is sketched below)
2. Mysql-specific usage in your application
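For problem 1, a crude filter up front is often enough. A sketch,
assuming the zero dates appear as quoted literals in the dump and the
target columns are nullable:

# Turn MySQL's zero dates and timestamps into NULLs before loading.
perl -pe "s/'0000-00-00( 00:00:00)?'/NULL/g" mydb.pg.sql > mydb.fixed.sql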
Then you might want to examine any performance issues (where your
application code has been tuned to work well with MySQL but not
necessarily PG).
Shouldn't be more than a day's work, maybe just 1/2 a day. I like to
build these things up as sets of perl scripts. That way when I notice
"one more thing" I can re-run my scripts from wherever the problem was.
Oh - if you come up with any improvements in mysql2pgsql then let the
developers know - I'm sure they'll be interested.
Good luck!
--
Richard Huxton
Archonet Ltd