From: doganmeh <mehmet(at)edgle(dot)com>
To: pgsql-general(at)postgresql(dot)org
Subject: Re: COPY: row is too big
Date: 2017-05-26 12:07:47
Message-ID: 1495800467776-5963385.post@n3.nabble.com
Lists: pgsql-general
I am piggy-backing on this thread because I have the same issue. I need to
import a CSV file that has 672 columns, each consisting of 12 alphanumeric
characters, such as:
SA03ARE1015D SA03ARE1S15N SB03ARE1015D ...
356412 275812 43106 ...
I am aware this is not normalized; however, we keep (or try to keep) the
source data intact and normalize it after importing into our system.
While trying to import all columns as type `text`, I get this error:
[54000] ERROR: row is too big: size 8760, maximum size 8160
Where: COPY temp_table, line 3
SQL statement "copy temp_table from
'/home/edgleweb/data/raw/TX/TAPR/2015/ADV/SSTAAR1ADV.csv' with delimiter ','
quote '"' csv "
I also tried varchar(12); nothing changed. My questions are: 1) The first
row (which actually contains the headers) has 672x12 = 8,064 characters, so
why does it complain that the row is 8760 bytes? I am assuming here that
type `text` occupies 1 byte per character. 2) Is there anything I can do to
work around this situation?
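Incidentally, I can reproduce the 8760 figure exactly if I assume each short
`text` value carries a 1-byte header on top of its 12 characters, plus a fixed
24-byte header per row (these overhead sizes are my assumptions about the
on-disk format, not anything stated in the error message):

```python
# Back-of-envelope for the reported row size (assumed overheads, not
# taken from the error message):
# - each 12-character text value stored with a 1-byte short header -> 13 bytes
# - a fixed 24-byte per-row (tuple) header
n_columns = 672
value_bytes = 12 + 1      # payload + assumed 1-byte per-value header
tuple_header = 24         # assumed fixed per-row overhead

row_size = tuple_header + n_columns * value_bytes
print(row_size)           # 8760, matching the size in the error message
```

If that accounting is right, the extra 696 bytes over my 8,064-character
estimate are pure per-value and per-row overhead.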
Thanks in advance.