| From: | "David G(dot) Johnston" <david(dot)g(dot)johnston(at)gmail(dot)com> |
|---|---|
| To: | Rama Krishnan <raghuldrag(at)gmail(dot)com> |
| Cc: | Postgres General <pgsql-general(at)postgresql(dot)org> |
| Subject: | Re: Value Too long varchar(100) |
| Date: | 2022-10-27 12:18:40 |
| Message-ID: | CAKFQuwb211JjQTr-XCy9zng1auU09xp4-jGyaAToY7eZLy4i0Q@mail.gmail.com |
| Lists: | pgsql-general |
On Thu, Oct 27, 2022 at 5:02 AM Rama Krishnan <raghuldrag(at)gmail(dot)com> wrote:
> Hi team,
>
> We are getting a csv file from a client to upload data into my db table. One
> particular column, i.e. the client description column, contains more than 100
> characters, hence I am getting "value too long for type character varying(100)",
> so we decided to upload only the first 100 characters into the db. How can I do
> this in the copy command?
>
You cannot. Either fix the content of the file or remove the arbitrary
length limitation on the field (i.e., change the type to "text"). I
suggest the latter. You may also copy into a temporary staging table that
lacks the limit, then use INSERT to move the transformed data (via a SELECT
query) into the production table.
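A sketch of that staging-table approach, assuming hypothetical table and column names (a production table `clients` with a `client_description varchar(100)` column; the real names and CSV layout will differ):

```sql
-- Hypothetical target table: clients(id int, client_description varchar(100)).
-- Staging table with the same columns but no length limit on the description.
CREATE TEMP TABLE clients_staging (
    id int,
    client_description text
);

-- Load the raw CSV into the staging table (psql client-side copy).
\copy clients_staging FROM 'clients.csv' WITH (FORMAT csv, HEADER true)

-- Move the data into the production table, keeping only the first 100 characters.
INSERT INTO clients (id, client_description)
SELECT id, left(client_description, 100)
FROM clients_staging;
```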
David J.