Re: MSSQL to PostgreSQL : Encoding problem

From: "Magnus Hagander" <mha(at)sollentuna(dot)net>
To: "Arnaud Lesauvage" <thewild(at)freesurf(dot)fr>, "Richard Huxton" <dev(at)archonet(dot)com>
Cc: <pgsql-general(at)postgresql(dot)org>, <tony_caduto(at)amsoftwaredesign(dot)com>
Subject: Re: MSSQL to PostgreSQL : Encoding problem
Date: 2006-11-22 14:01:27
Message-ID: 6BCB9D8A16AC4241919521715F4D8BCEA3596B@algol.sollentuna.se
Lists: pgsql-general

> >> I already posted this as "COPY FROM encoding error", but I have
> >> been doing some more tests since then.
> >>
> >> I'm trying to export data from MS SQL Server to PostgreSQL.
> >> The tables are quite big (>20M rows), so a CSV export and a "COPY
> >> FROM" import seems to be the only reasonable solution.
> >
> > Or go via MS-Access/Perl and ODBC/DBI perhaps?
>
> Yes, I think it would work. The problem is that the DB is too
> big for this kind of export. Using DTS from MSSQL to export
> directly to PostgreSQL using the psqlODBC Unicode Driver, I
> exported ~1000 rows per second in a 2-column table with ~20M
> rows. That means several days just for this table, and I have
> bigger ones!
>

Interesting. What did you set the "Insert batch size" to? (I think that's
available for all transformation tasks.) And did you remember to check
the box for "use transactions"?

While it's never as fast as a COPY, it should be possible to make it
faster than that, I think.

Another option is to just BCP the file out, and then COPY it into
PostgreSQL. No nice GUI, but you can tune almost everything with BCP.
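For what it's worth, a rough sketch of that route might look something like
the below. The table, file and server names are just placeholders, the -T
flag assumes a trusted connection on the MSSQL side, and WIN1252 is only a
guess at the codepage your data uses; adjust all of that to your setup:

    -- 1. On the MSSQL box, dump the table with BCP in character mode
    --    (-c gives tab-delimited output by default):
    --
    --      bcp MyDatabase.dbo.bigtable out bigtable.dat -c -S sqlserver -T
    --
    -- 2. On the PostgreSQL side, tell the backend what encoding the file
    --    is in, then load it with COPY (tab is also the default delimiter
    --    for text-mode COPY). Server-side COPY FROM needs superuser and a
    --    file on the server; use psql's \copy if the file is on the client.
    SET client_encoding TO 'WIN1252';
    COPY bigtable FROM '/tmp/bigtable.dat';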

//Magnus
