Re: Getting Mysql data into Postgres: least painful methods?

From: Ken Tanzer <ken(dot)tanzer(at)gmail(dot)com>
To: Adrian Klaver <adrian(dot)klaver(at)gmail(dot)com>
Cc: pgsql-general(at)postgresql(dot)org
Subject: Re: Getting Mysql data into Postgres: least painful methods?
Date: 2013-01-16 00:41:44
Message-ID: CAD3a31UhEFWdMX0cAkFdwvvyiD8etjR++_Dyuz0AMt4nvdwP0A@mail.gmail.com
Lists: pgsql-general

>
> 8.4 supports FDW so I will assume you do not have the permissions to
> create one.
>

Thanks, but I'm confused: the doc page you mentioned says the mysql FDW
isn't supported until 9.1.

>
>> In this case it looks like 24 tables, with CSV-like import files
>> totaling 7G.
>>
>> Since there didn't seem to be a clean, simple and automated path from
>> mysql to postgres, I'm back to skipping mysql entirely and just trying
>> to modify the mysql files to feed directly into postgres.
>>
>> To that end, they have to be transformed a bit, which I've written a bit
>> of script to accomplish. I'm wondering if there's a way to avoid
>> creating another 7G of slightly-modified import files before feeding
>> them to postgres. Specifically, is there a way to do something like
>>
>> \copy my_table FROM '`cat my_import_file | my_transform_script`'
>>
>
>
> The way I have done this is to create a script using Python that follows
> this flow:
>
> MySQL --> MySQLdb module --> Data transform --> psycopg2 --> Postgres
> OR
> csv --> csv module ------^
>
>
> In the script you can set up the transactions as you like: per row, in
> batches, or everything in one transaction.
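
If I follow, the flow you're describing would look something like this
(just a sketch on my end; the connection, table, and per-row transform
are all made up):

import csv
import psycopg2

# Hypothetical sketch: read one CSV-ish mysql export, transform each row
# in Python, insert with psycopg2, and commit once at the end so the
# whole load is a single transaction.
conn = psycopg2.connect(dbname='mydb')
cur = conn.cursor()
with open('my_import_file') as f:
    for row in csv.reader(f, delimiter='\t'):
        # e.g. map MySQL's \N marker to a real NULL
        row = [None if col == '\\N' else col for col in row]
        cur.execute('INSERT INTO my_table VALUES (%s, %s, %s)', row)
conn.commit()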
>
I'm also not sure about this. Are we both talking about a process that
bypasses mysql itself and transforms its input files on the fly? I want to
write a script that will...

#!/bin/sh
# ... my script stuff ...
# ... create tables ...
psql -c "\copy table1 from (transformed mysql file 1)"
psql -c "\copy table2 from (transformed mysql file 2)"
psql -c "\copy table3 from (transformed mysql file 3)"
# ... more script stuff ...

without rewriting the mysql files, and within one transaction. I can't
tell if your answer was getting at that, or something else.
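
In Python terms, maybe the equivalent of what I'm after is something like
this (again only a sketch; the file names, table names, and transform
script are invented):

import subprocess
import psycopg2

# Sketch: pipe each mysql file through the transform script and straight
# into COPY, loading every table inside one transaction. The script is
# assumed to emit Postgres COPY text format on stdout.
conn = psycopg2.connect(dbname='mydb')
cur = conn.cursor()
for table, path in [('table1', 'my_import_file_1'),
                    ('table2', 'my_import_file_2'),
                    ('table3', 'my_import_file_3')]:
    with open(path, 'rb') as f:
        proc = subprocess.Popen(['./my_transform_script'], stdin=f,
                                stdout=subprocess.PIPE)
        cur.copy_expert('COPY %s FROM STDIN' % table, proc.stdout)
        proc.wait()
conn.commit()  # nothing becomes visible unless every COPY succeeds

That would avoid writing out a second 7G of files, though I'd still rather
keep the whole thing in plain shell if there's a way.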

Ken

>> My 2 goals here are to be somewhat efficient (by not duplicating the
>> input files), and to keep this all within a transaction. I could have
>> the script transform each file separately and pipe it to postgres:
>>
>> (echo 'copy mytable from stdin...' ; cat my_import_file |
>> my_transform_script ) | psql
>>
>> but I'm thinking that there's no way to group those all into a
>> transaction.
>>
>> Hopefully this makes sense, and any suggestions welcome. Thanks.
>>
>> Ken
>>
>
> --
> Adrian Klaver
> adrian(dot)klaver(at)gmail(dot)com
>

--
AGENCY Software
A data system that puts you in control
http://agency-software.org/
ken(dot)tanzer(at)agency-software(dot)org
(253) 245-3801
