From: Chris Ruprecht <chris(at)cdrbill(dot)com>
To: pgsql-general(at)postgresql(dot)org
Subject: Importing binary data
Date: 2014-10-27 19:46:23
Message-ID: B99FC604-79E2-4E95-9FF0-EC6CC395E583@cdrbill.com
Lists: pgsql-general
Hey guys,
I was given a database backup of a non-PostgreSQL database. That database contains records where binary files (they look like email attachments) were split into chunks of X characters each and stored across multiple records, a messy way of storing BLOB data. The database encoding is LATIN1 (ISO 8859-1).
These chunks are actually 50 fields of 60 bytes each per row. If the original file is larger than that, more than one row is used.
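For illustration, the reassembly step described above could be sketched like this. This is only an assumption about the layout (50 chunk fields of up to 60 bytes per row, rows in order, trailing fields of the last row empty); the function name and row shape are made up:

```python
# Hypothetical sketch: glue the chunked BLOB back into one byte string.
# Assumes each row is a sequence of 50 fields of up to 60 bytes each,
# in order, with unused trailing fields empty.

CHUNKS_PER_ROW = 50
CHUNK_SIZE = 60

def reassemble(rows):
    """Concatenate the chunk fields of each row, in order, into one blob."""
    parts = []
    for row in rows:
        for chunk in row:
            if chunk:  # skip empty trailing fields of the final row
                parts.append(chunk)
    return b"".join(parts)
```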
I can export the data out of that database into flat files just fine, but when I try to import the data into Postgres, I get errors like this:
ERROR: invalid byte sequence for encoding "SQL_ASCII": 0x00
CONTEXT: COPY attachments, line 14: "58025 1 cl\Cert.r 10 M04P'15A415).($-H87)4:6UE+$-(05)!0U1%4BQ)3E!55"!I5&EM92!)3E1% M1T52'$585$523B!7..."
I tried LATIN1, SQL_ASCII, and UTF-8; nothing works. I even tried making the data type 'bytea', with no luck. I'd love to have a "NO-CONVERSION" option on the COPY command that just takes whatever bytes come along and doesn't try to interpret them.
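(For anyone hitting the same wall: COPY in text format can never carry a literal 0x00 byte, regardless of the encoding setting, which is why switching encodings doesn't help. One workaround, sketched below under the assumption that the target column is bytea, is to pre-encode each binary value in PostgreSQL's hex bytea input format, doubling the backslash so COPY's text-format escaping passes a single `\x...` through to the bytea parser. The function name here is made up.)

```python
# Sketch: encode raw bytes for a bytea column in a COPY text-format file.
# The file then contains only ASCII, so no encoding conversion can choke
# on it. "\\\\x" in Python source is the two-character sequence \\ plus x;
# COPY unescapes \\ to \, and the bytea parser then sees \x<hex digits>.

def to_copy_bytea(raw: bytes) -> str:
    """Return the COPY-text-format representation of raw bytes for bytea."""
    return "\\\\x" + raw.hex()
```

Writing each chunk through a converter like this before COPY sidesteps the 0x00 problem entirely, since the NUL byte becomes the two hex digits `00`.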
Any ideas of what I can do to import this stuff?
best regards,
chris
--
chris ruprecht
database grunt and bit pusher extraordinaire