Re: import CSV file to a table

From: Bret Stern <bret_stern(at)machinemanagement(dot)com>
To: Karl Czajkowski <karlcz(at)isi(dot)edu>
Cc: Rob Sargent <robjsargent(at)gmail(dot)com>, pgsql-general(at)postgresql(dot)org
Subject: Re: import CSV file to a table
Date: 2017-03-08 16:54:49
Message-ID: 1488992089.3126.8.camel@bret.machinemanagement.com
Lists: pgsql-general

I'll throw in.

If a tab-delimited export is available, perhaps that option will work
better... or use Access to find the violations of the quoted,
comma-delimited assumptions, then export from Access and import.
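
For the tab-delimited route, a rough sketch with psycopg2 (the table
name, file path and connection string below are just placeholders, not
from this thread) might look like:

    # Sketch only: load a tab-delimited export into Postgres with COPY.
    # "staging_table", the file path and the DSN are placeholders.
    import psycopg2

    conn = psycopg2.connect("dbname=mydb user=me")
    with conn, conn.cursor() as cur, open("/tmp/export.tsv") as f:
        cur.copy_expert(
            "COPY staging_table FROM STDIN WITH (FORMAT csv, DELIMITER E'\\t')",
            f,
        )

The same COPY statement can also be run from psql with \copy when the
file lives on the client machine.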

Bret

On Wed, 2017-03-08 at 08:36 -0800, Karl Czajkowski wrote:

> I believe that in its full glory, you cannot reliably locate CSV
> record boundaries except by parsing each field in order, including
> quote processing. Individual records may have arbitrary numbers of
> field and record separator characters within the values.
>
> Karl
>
>
> On Mar 08, Rob Sargent modulated:
> > Since bash has been bandied about in this thread I presume awk is
> > available. Here's how I would check just how 'csv'ish the incoming
> > file is.
> > ...
>
>
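
To illustrate Karl's point, here is a small Python check (the sample
data is made up) showing why splitting on newlines miscounts records
when quoted fields can contain the separators themselves:

    # Illustration of the quoting problem; the sample data is made up.
    import csv, io

    data = 'id,note\n1,"hello, world"\n2,"line one\nline two"\n'

    print(len(data.splitlines()))                    # 4 -- naive line count
    print(len(list(csv.reader(io.StringIO(data)))))  # 3 -- header + 2 records

Only a quote-aware reader finds the real record boundaries, because the
embedded newline in the third row is part of a value, not a separator.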
