From: Deepblues <deepblues(at)gmail(dot)com>
To: KeithW(at)narrowpathinc(dot)com
Cc: pgsql-novice(at)postgresql(dot)org
Subject: Re: Import csv file into multiple tables in Postgres
Date: 2005-03-03 20:10:58
Message-ID: bd6cfb9e05030312106a92d646@mail.gmail.com
Lists: pgsql-novice
Thanks for all that information. My scenario looks like this:
I need to import an Excel spreadsheet into the PostgreSQL database.
I converted the Excel spreadsheet into a csv file, and now I have 3
tables in my database where I need to import this data. I'm new to both
Perl and Postgres. Do you have any sample script that I can have a
look at which loads the csv file into an interface table?
Deepblues
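
A minimal sketch of that first step, using PostgreSQL's COPY command;
the staging table and its columns are hypothetical and must mirror the
csv file's layout:

    -- Interface table: one text column per csv field, so the load is
    -- forgiving; types can be enforced later, inside the database.
    CREATE TABLE staging_import (
        customer_name text,
        order_date    text,
        item_code     text,
        quantity      text
    );

    -- Server-side load; the file must be readable by the server.
    -- From psql, "\copy staging_import from 'file.csv' csv header"
    -- reads a client-side file instead.
    COPY staging_import FROM '/tmp/spreadsheet.csv' WITH CSV HEADER;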
On Tue, 01 Mar 2005 12:06:34 -0500, Keith Worthington
<KeithW(at)narrowpathinc(dot)com> wrote:
> Sean Davis wrote:
>
> >
> > ----- Original Message ----- From: "Andrew Hammond"
> > <ahammond(at)ca(dot)afilias(dot)info>
> > To: "Deepblues" <deepblues(at)gmail(dot)com>
> > Cc: <pgsql-novice(at)postgresql(dot)org>
> > Sent: Sunday, February 27, 2005 9:28 PM
> > Subject: Re: [NOVICE] Import csv file into multiple tables in Postgres
> >
> >
> >> The brief answer is no, you can not import from a single csv file
> >> into multiple tables.
> >>
> >> If the csv file consists of two distinct sections of data, then you
> >> could of course split it into two csv files. If what you want to do
> >> is normalize existing data, then you should first import the existing
> >> data into a working table. Then you can manipulate it within the
> >> database.
> >>
> >> It is unlikely that you will need perl to do any of this.
> >
> >
> > I use Perl a lot for stuff like this, but have found that in most
> > cases the easiest thing to do is to load the data into a single
> > PostgreSQL table and then write the SQL selects and inserts that
> > populate the multiple tables. This has the added advantage that
> > you keep a copy of the original data available in case you don't
> > put every column into the "working" database. If you end up doing
> > this a lot, you can create a separate "loader" schema that holds
> > all of these raw csv tables in one place, not visible to most
> > users, so as not to clutter the "working" schema.
> >
> > Sean
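
A minimal sketch of what Sean describes, continuing the hypothetical
staging table above; the loader schema is his suggestion, while the
customers and orders tables and their columns are invented here:

    -- Collect the raw csv tables in one place, out of the "working"
    -- schema, and create the staging table there instead.
    CREATE SCHEMA loader;

    -- Plain SQL then distributes the raw rows into the real tables.
    INSERT INTO customers (name)
        SELECT DISTINCT customer_name FROM loader.staging_import;

    INSERT INTO orders (customer_id, order_date, item_code, quantity)
        SELECT c.customer_id, s.order_date::date, s.item_code,
               s.quantity::integer
        FROM loader.staging_import s
        JOIN customers c ON c.name = s.customer_name;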
>
> I do nearly exactly what Sean has suggested, all the time.
>
> We have a schema called data_transfer that contains all of the tables
> for accepting data. (As an aside the data is written into these tables
> by a shell script using the COPY command.) Each of the tables has a
> trigger that fires and executes a function. The function is responsible
> for taking the data out of the data_transfer table and putting it in
> one, two, or three target tables. In addition to the advantage noted
> by Sean, you can also manipulate the data during this process, as
> well as check for duplicates, dynamically switch between INSERT and
> UPDATE commands, etc.
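
A minimal sketch of such a trigger, with invented table and column
names; trying an UPDATE first and falling back to INSERT is one way to
switch dynamically between the two commands:

    -- Fires once per row that COPY writes into the transfer table.
    CREATE FUNCTION data_transfer.distribute_row() RETURNS trigger AS $$
    BEGIN
        -- Try an UPDATE first; if no row matched, INSERT instead.
        UPDATE customers SET phone = NEW.phone
         WHERE name = NEW.customer_name;
        IF NOT FOUND THEN
            INSERT INTO customers (name, phone)
            VALUES (NEW.customer_name, NEW.phone);
        END IF;
        RETURN NEW;
    END;
    $$ LANGUAGE plpgsql;

    CREATE TRIGGER distribute_after_load
        AFTER INSERT ON data_transfer.incoming
        FOR EACH ROW EXECUTE PROCEDURE data_transfer.distribute_row();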
>
> --
> Kind Regards,
> Keith
>
>
--
Deepti Mandava
Graduate Teaching Assistant,
The University of Kansas
Lawrence, KS 66045-7546