From: "Dave [Hawk-Systems]" <dave(at)hawk-systems(dot)com>
To: <pgsql-general(at)postgresql(dot)org>
Subject: Re: populate table with large csv file
Date: 2003-09-26 11:58:16
Message-ID: DBEIKNMKGOBGNDHAAKGNIENDFBAC.dave@hawk-systems.com
Lists: pgsql-general, pgsql-performance
>> aside from parsing the csv file through a PHP interface, what is the
>> easiest way to get that csv data imported into the postgres database.
>> thoughts?
>
>Assuming the CSV file data is well formed, use psql and
>the COPY command.
>
>In psql, create the table. Then issue command:
>
>copy <tablename> from 'filename' using delimiters ',';
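As a concrete sketch of that recipe (the table name, columns, and file path below are all invented for illustration, and the syntax shown is the 7.x-era form quoted above):

```sql
-- Create a table whose columns match the CSV fields (hypothetical schema)
CREATE TABLE readings (id integer, taken_on date, value numeric);

-- Bulk-load the file in one pass, using the older syntax from the reply:
COPY readings FROM '/tmp/readings.csv' USING DELIMITERS ',';

-- Later PostgreSQL versions add a CSV mode that also handles quoting
-- and embedded delimiters:
-- COPY readings FROM '/tmp/readings.csv' WITH CSV;
```

COPY runs inside the server as a single command, which is why it is so much faster than feeding the rows through per-row INSERTs from PHP.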
Perfect solution that was overlooked.
Unfortunately, processing the 143 MB file, which results in a database of
roughly 500 MB, takes an eternity. As luck would have it, we can get away with
just dropping to an exec and doing a cat/grep for any data we need... that
takes 2-3 seconds.
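The "drop to an exec and grep" lookup described above could look something like the sketch below; the file name, columns, and key value are all invented stand-ins for the real data:

```shell
#!/bin/sh
# Tiny demo file standing in for the real 143 MB CSV (contents invented).
printf '1001,widget,4.25\n1002,gadget,7.50\n' > /tmp/items.csv

# Pull the one row we need by its leading key, skipping the database
# entirely; anchoring on ^key, avoids matching the key mid-field.
grep '^1002,' /tmp/items.csv
# -> 1002,gadget,7.50
```

This only works for simple exact-key lookups on well-formed CSV; anything involving joins, quoting, or embedded commas is better left to the database.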
The COPY command is definitely a keeper, as I am now looking at replacing code
elsewhere with a simpler model that uses it.
Thanks
Dave
pgsql-general:
Next message: Ron Johnson | 2003-09-26 12:37:35 | Re: populate table with large csv file
Previous message: Angshuman Basu | 2003-09-26 11:53:07 | query

pgsql-performance:
Next message: Ron Johnson | 2003-09-26 12:37:35 | Re: populate table with large csv file
Previous message: rantunes | 2003-09-26 10:57:28 | Re: Indices arent being used