Re: create a script which imports csv data

From: Craig Ringer <ringerc(at)ringerc(dot)id(dot)au>
To: Robert Buckley <robertdbuckley(at)yahoo(dot)com>
Cc: "pgsql-general(at)postgresql(dot)org" <pgsql-general(at)postgresql(dot)org>
Subject: Re: create a script which imports csv data
Date: 2012-06-29 03:01:37
Message-ID: 4FED1A91.5010708@ringerc.id.au
Lists: pgsql-general

On 06/28/2012 07:53 PM, Robert Buckley wrote:
> Hi,
>
> I have to create a script which imports csv data into postgresql
> ...and have a few questions about the best way to do it.

The advice already given is pretty good. Remember you can always create
a clean new table, then use INSERT INTO ... SELECT to populate it from a
scratch table you loaded your CSV into, so you don't have to do your
cleanups/transformations on the CSV file or during the COPY itself.
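
A minimal sketch of that pattern (the table and column names here are
invented for illustration):

    -- Scratch table: everything as text, so COPY can't choke on dirty values
    CREATE TABLE import_raw (
        name    text,
        joined  text,
        amount  text
    );

    -- Server-side COPY; use psql's \copy instead if the file lives on the client
    COPY import_raw FROM '/path/to/data.csv' WITH (FORMAT csv, HEADER true);

    -- Clean target table with proper types and constraints
    CREATE TABLE people (
        name    text NOT NULL,
        joined  date,
        amount  numeric
    );

    -- Do the cleanup/transformation in SQL on the way across
    INSERT INTO people (name, joined, amount)
    SELECT trim(name),
           NULLIF(joined, '')::date,
           NULLIF(amount, '')::numeric
    FROM   import_raw;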

If it's a big job, if it's going to be regular, or if you're going to
have to merge it with more imports later, consider an ETL tool like
Pentaho's Kettle: http://kettle.pentaho.com/

For very, very fast loading of bulk data, consider pg_bulkload
(http://pgbulkload.projects.postgresql.org/). It's only worth the
hassle if your load would take many, many hours without it.
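
Roughly how it's driven, assuming the control-file directives from the
pg_bulkload docs (the names and paths here are invented; check the
documentation for your version):

    # load_people.ctl -- pg_bulkload control file
    OUTPUT = import_raw               # target table
    INPUT = /path/to/data.csv         # absolute path to the input file
    TYPE = CSV                        # input format
    DELIMITER = ","                   # field separator

and then something like:

    pg_bulkload -d mydb load_people.ctl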

--
Craig Ringer
