From: "Thomas H(dot)" <me(at)alternize(dot)com>
To: "Arnaud Lesauvage" <thewild(at)freesurf(dot)fr>, "Richard Huxton" <dev(at)archonet(dot)com>
Cc: <pgsql-general(at)postgresql(dot)org>, <tony_caduto(at)amsoftwaredesign(dot)com>
Subject: Re: MSSQL to PostgreSQL : Encoding problem
Date: 2006-11-22 14:36:03
Message-ID: 012001c70e43$8a85f120$0201a8c0@iwing
Lists: pgsql-general
>>>> Or go via MS-Access/Perl and ODBC/DBI perhaps?
>>>
>>> Yes, I think it would work. The problem is that the DB is too big for
>>> this kind of export. Using DTS from MSSQL to export directly to
>>> PostgreSQL using the psqlODBC Unicode Driver, I exported ~1000 rows per
>>> second in a 2-column table with ~20M rows. That means several days just
>>> for this table, and I have bigger ones!
>>
>> Well it's about 0.25 days, but if it's too long, it's too long.
>
> Sure, sorry for the confusion, the problem is with the other tables (same
> number of rows but a lot of columns, some very large).
>
Well, if it's too slow, then you will have to dump the DB to a text file (DTS
does this for you) and then convert the text file to UTF-8 manually before
importing it into pgsql. iconv for Win32 will help you there. I found that it
removes some wanted special characters, though, so watch out.
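
For what it's worth, a minimal Python sketch of that conversion step (the file
names and the source encoding are just placeholders; check what DTS actually
wrote, often UTF-16LE or the local Windows codepage such as cp1252):

SRC_ENCODING = "utf-16-le"   # assumption - adjust to the actual export encoding
CHUNK = 1 << 20              # convert ~1M characters at a time, so huge dumps fit in memory

with open("dump.txt", "r", encoding=SRC_ENCODING) as src, \
     open("dump_utf8.txt", "w", encoding="utf-8", newline="") as dst:
    while True:
        chunk = src.read(CHUNK)
        if not chunk:
            break
        dst.write(chunk)

# afterwards, load the converted file from psql, e.g.:
#   \copy mytable FROM 'dump_utf8.txt'

It's slower than iconv, but Python's open() defaults to errors="strict", so it
raises on anything it can't decode rather than silently dropping it, which makes
the missing-characters problem easier to spot.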
a less "scientific" approach would be using an unicode-aware texteditor to
convert it (ultraedit does this pretty nicely, for example). have had good
results with it.
Loading several million rows will always take some time, though.
- thomas