From: "David G(dot) Johnston" <david(dot)g(dot)johnston(at)gmail(dot)com>
To: Karl Czajkowski <karlcz(at)isi(dot)edu>
Cc: John McKown <john(dot)archie(dot)mckown(at)gmail(dot)com>, vinny <vinny(at)xs4all(dot)nl>, Günce Kaya <guncekaya14(at)gmail(dot)com>, PostgreSQL General <pgsql-general(at)postgresql(dot)org>, "pgsql-general-owner(at)postgresql(dot)org" <pgsql-general-owner(at)postgresql(dot)org>
Subject: Re: import CSV file to a table
Date: 2017-03-08 16:21:16
Message-ID: CAKFQuwbur63EfDN=rf6S_DKUVgyHrEp9yMJmovTQDCWkyEeJ8w@mail.gmail.com
Lists: pgsql-general
On Wed, Mar 8, 2017 at 9:13 AM, Karl Czajkowski <karlcz(at)isi(dot)edu> wrote:
>
> With the temporary table, you can use SQL for most validation or data
> interrogation, but you need to know at least enough schema information
> in advance to form the COPY statement. Parsing the CSV header row to
> plan your work puts you right back to requiring a robust CSV parser
> unless you can constrain your input scenarios to only handle very
> trivial headers.
>
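[Editor's note: the header-parsing concern above can be sketched briefly. This is a minimal illustration, not from the thread; the staging-table name `staging` and the sample data are hypothetical. It uses Python's `csv` module as the "robust CSV parser" Karl alludes to, parsing the header row to build the column list for a COPY statement.]

```python
import csv
import io

# Hypothetical CSV input whose header must be parsed before COPY can be formed
sample = 'id,"full name",note\n1,"Doe, Jane",ok\n'

reader = csv.reader(io.StringIO(sample))
header = next(reader)  # the csv module correctly handles the quoted comma

# Double-quote each identifier (escaping embedded quotes) for the COPY column list
cols = ", ".join('"{}"'.format(c.replace('"', '""')) for c in header)
copy_sql = 'COPY staging ({}) FROM STDIN WITH (FORMAT csv, HEADER true)'.format(cols)
print(copy_sql)
# → COPY staging ("id", "full name", "note") FROM STDIN WITH (FORMAT csv, HEADER true)
```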
You can write the entire contents of the CSV into a psql variable and
process the text blob from there using intermediate arrays.
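A minimal psql sketch of that idea (the file path /tmp/data.csv is an assumption, and the naive comma split below does not honor CSV quoting of embedded commas or newlines):

```sql
-- Capture the whole file in a psql variable via a backtick shell command
\set content `cat /tmp/data.csv`

-- Split the blob into lines, then each line into a text[] of fields
SELECT string_to_array(line, ',') AS fields
FROM unnest(string_to_array(:'content', E'\n')) AS t(line);
```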
David J.