From: Patrick Hatcher <pathat(at)comcast(dot)net>
To: Tom Lane <tgl(at)sss(dot)pgh(dot)pa(dot)us>
Cc: pgsql-general(at)postgresql(dot)org
Subject: Re: What encoding to use for this error?
Date: 2005-04-06 18:56:38
Message-ID: 425430E6.5010602@comcast.net
Lists: pgsql-general
Thank you. I'll take a look at our data export function.
Tom Lane wrote:
> Patrick Hatcher <pathat(at)comcast(dot)net> writes:
>> We're testing moving our data to UNICODE from LATIN1, but when I try to
>> import my data, I get the following error:
>>
>> DBD::Pg::st execute failed: ERROR: Unicode characters greater than or
>> equal to 0x10000 are not supported
>> CONTEXT: COPY bcp_mdc_products, line 120, column description: "Lladró
>> "Ducks in a Basket""
>
> Do you have client_encoding set properly while doing the import?
> If the incoming data is LATIN1, you need to say so, so that the backend
> knows how to convert it to UNICODE.
>
> 			regards, tom lane
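Tom's diagnosis can be illustrated outside the thread. The sketch below is not from the original exchange; the sample string and byte arithmetic are illustrative assumptions, but they show why LATIN1 bytes mislabelled as UTF-8 produce this exact error: the LATIN1 byte for "ó" (0xF3) looks like the lead byte of a 4-byte UTF-8 sequence, and 4-byte sequences always encode code points at or above U+10000, which PostgreSQL servers of that era rejected.

```python
# Illustrative sketch (not from the thread): why LATIN1 data sent to a
# UNICODE database without setting client_encoding trips the
# "characters greater than or equal to 0x10000" check.

text = 'Lladró "Ducks in a Basket"'

# In LATIN1, "ó" is the single byte 0xF3.
latin1_bytes = text.encode("latin-1")
assert latin1_bytes[5] == 0xF3

# Read as a UTF-8 lead byte, 0xF3 (0b11110011) matches the 0b11110xxx
# pattern that announces a 4-byte sequence, i.e. a code point of
# U+10000 or higher -- hence the error, before the rest of the
# (invalid) sequence is even examined.
assert latin1_bytes[5] & 0b11111000 == 0b11110000

# The fix Tom suggests is to declare the client encoding so the backend
# converts; the client-side equivalent is re-encoding the data:
utf8_bytes = latin1_bytes.decode("latin-1").encode("utf-8")
print(utf8_bytes[5:7])  # b'\xc3\xb3' -- "ó" as a valid 2-byte sequence
```

On the database side the same declaration would be a one-liner such as `SET client_encoding = 'LATIN1';` before the COPY (or the PGCLIENTENCODING environment variable for the importing client).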
Next message: Martijn van Oosterhout, 2005-04-06 18:57:44, "Re: Big trouble with memory !!"
Previous message: Otto Blomqvist, 2005-04-06 18:53:45, "Problems with Set Returning Functions (SRFs)"