| From: | Harald Fuchs <hf0217x(at)protecting(dot)net> |
|---|---|
| To: | pgsql-general(at)postgresql(dot)org |
| Subject: | Re: Importing *huge* mysql database into pgsql |
| Date: | 2007-03-06 17:34:15 |
| Message-ID: | pur6s2mg2w.fsf@srv.protecting.net |
| Lists: | pgsql-general |
In article <1173191066(dot)416664(dot)320470(at)n33g2000cwc(dot)googlegroups(dot)com>,
".ep" <erick(dot)papa(at)gmail(dot)com> writes:
> Hello,
> I would like to convert a mysql database with 5 million records and
> growing, to a pgsql database.
> All the stuff I have come across on the net has things like
> "mysqldump" and "psql -f", which sounds like I will be sitting forever
> getting this to work.
> Is there anything else?
If you really want to convert a *huge* MySQL database (and not your
tiny 5M-record thingie), I'd suggest "mysqldump -T". This creates, for
each table, an .sql file containing just the DDL and a .txt file
containing the data.
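A minimal sketch of that step, with invented names (database "mydb",
dump directory /tmp/dump). One caveat: with -T (a.k.a. --tab) the .txt
files are written by the MySQL *server* via SELECT ... INTO OUTFILE, so
the directory must be on the server host, be writable by mysqld, and
your MySQL account needs the FILE privilege:

```sh
# Invented names: database "mydb", dump directory /tmp/dump.
# The directory must exist on the MySQL server host and be writable
# by the mysqld process, since the server writes the .txt files itself.
mkdir -p /tmp/dump && chmod a+w /tmp/dump

mysqldump -T /tmp/dump mydb   # -T dir is the same as --tab=dir

ls /tmp/dump                  # one <table>.sql and one <table>.txt per table
```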
Then edit all .sql files:
* Fix type and index definitions etc.
* Append a "COPY thistbl FROM 'thispath/thistbl.txt';"
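For instance, a hypothetical users.sql might look like this after
editing (table and column names invented; the comments show the kind
of MySQL constructs that need translating). MySQL's default --tab
output is tab-separated with \N for NULL, which matches COPY's default
text format, so the COPY line usually needs no extra options:

```sql
-- Hypothetical users.sql after editing.  MySQL's DDL was roughly:
--   CREATE TABLE users (
--     id INT NOT NULL AUTO_INCREMENT,
--     active TINYINT(1) NOT NULL DEFAULT 1,
--     PRIMARY KEY (id)
--   ) ENGINE=InnoDB;
CREATE TABLE users (
    id     serial PRIMARY KEY,            -- AUTO_INCREMENT -> serial
    active boolean NOT NULL DEFAULT true  -- TINYINT(1) -> boolean;
                                          -- '0'/'1' in the .txt load fine
);
COPY users FROM '/tmp/dump/users.txt';

-- With serial columns, bump the sequence past the loaded ids afterwards:
SELECT setval('users_id_seq', (SELECT max(id) FROM users));
```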
Then run all .sql files with psql, in an order dictated by foreign keys.
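Something along these lines, with an invented database name and file
order (parent tables before the tables whose foreign keys reference
them). Note that COPY ... FROM 'file' reads the file on the PostgreSQL
server itself, so psql must connect as a user allowed to do that
(classically a superuser); if you're loading from another machine,
append psql's client-side \copy instead:

```sh
# Hypothetical load order: parent tables first, referencing tables last.
# ON_ERROR_STOP=1 makes psql exit nonzero on the first SQL error,
# so the loop stops instead of ploughing on.
for f in users.sql orders.sql order_items.sql; do
    psql -v ON_ERROR_STOP=1 -d mydb -f "/tmp/dump/$f" || break
done
```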