Re: Importing *huge* mysql database into pgsql

From: "Webb Sprague" <webb(dot)sprague(at)gmail(dot)com>
To: (dot)ep <erick(dot)papa(at)gmail(dot)com>
Cc: pgsql-general(at)postgresql(dot)org
Subject: Re: Importing *huge* mysql database into pgsql
Date: 2007-03-06 14:39:11
Message-ID: b11ea23c0703060639k4adb506bgd7b49d4d803c0edc@mail.gmail.com
Lists: pgsql-general

> I would like to convert a mysql database with 5 million records and
> growing, to a pgsql database.
>
> All the stuff I have come across on the net has things like
> "mysqldump" and "psql -f", which sounds like I will be sitting forever
> getting this to work.

Have you tried it? 5 million rows seems doable. In postgres, make
sure you disable indexes and constraint checks while you do the import,
and use bulk COPY rather than individual INSERTs.
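
Something along these lines is the shape of it (just a rough sketch;
the table, columns, and file path are made up, and it assumes you have
already dumped each MySQL table to a tab-delimited file):

    BEGIN;

    -- Create the table bare: no indexes or foreign keys yet,
    -- add them after the data is loaded.
    CREATE TABLE orders (
        id          integer,
        customer_id integer,
        placed_on   date,
        total       numeric(10,2)
    );

    -- Bulk-load with COPY (default format is tab-delimited text).
    -- Use \copy from psql instead if you aren't a superuser.
    COPY orders FROM '/tmp/orders.tsv';

    -- Build indexes and constraints in one pass over the loaded data.
    ALTER TABLE orders ADD PRIMARY KEY (id);
    CREATE INDEX orders_customer_idx ON orders (customer_id);

    COMMIT;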

How long is forever? Can you go offline? If you only need to do it
once, it probably won't be too painful.

W
