From: Harald Fuchs <nospam(at)sap(dot)com>
To: pgsql-general(at)postgresql(dot)org
Subject: Re: migrate from postgres to mysql
Date: 2003-10-06 13:32:49
Message-ID: puy8vyebjy.fsf@srv.protecting.net
Lists: pgsql-general
In article <1065126701(dot)1431(dot)8(dot)camel(at)localhost(dot)localdomain>,
Scott Cain <cain(at)cshl(dot)org> writes:
> Well, I've not done it, but you could do a
> pg_dump -s dbname >schema.sql
> pg_dump -d -a dbname >data.sql
> to get just the schema in one file and the data in inserts in another
> file. Then you could use a perl script driven by SQL::Translator (check
> http://www.cpan.org) to translate the schema from Pg to MySQL. Create
> the schema in MySQL, then load via the inserts.
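The quoted steps might look like the following untested sketch. It assumes SQL::Translator is installed from CPAN (it ships a command-line front end, `sqlt`), and `dbname` stands in for the real database name:

```shell
# Dump schema and data separately (pg_dump options as spelled in the
# quoted post: -s = schema only, -a = data only, -d = data as INSERTs).
pg_dump -s dbname > schema.sql
pg_dump -d -a dbname > data.sql

# Translate the DDL from PostgreSQL to MySQL dialect with sqlt.
sqlt --from PostgreSQL --to MySQL schema.sql > schema.mysql.sql

# Create the schema in MySQL, then replay the INSERTs.
mysql dbname < schema.mysql.sql
mysql dbname < data.sql
```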
I'd replace the second pg_dump call with a "COPY mytbl TO 'mytbl.txt'"
for each table in the DB and import the files into MySQL with "LOAD DATA
[LOCAL] INFILE". This would be much faster than INSERTing row by row.
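A rough sketch of that approach, assuming both servers are reachable from the client; the table names are placeholders, and psql's client-side \copy is used here so no server filesystem access is needed:

```shell
# Export each table as tab-delimited text. PostgreSQL's default COPY
# text format uses tabs as field separators and \N for NULL.
for tbl in mytbl1 mytbl2; do
  psql -d dbname -c "\\copy $tbl TO '$tbl.txt'"
done

# Bulk-load into MySQL. LOAD DATA's defaults (tab-delimited fields,
# \N for NULL) line up with PostgreSQL's default COPY output for
# simple column types; check dates, booleans, and escaping by hand.
for tbl in mytbl1 mytbl2; do
  mysql --local-infile=1 dbname \
    -e "LOAD DATA LOCAL INFILE '$tbl.txt' INTO TABLE $tbl"
done
```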