From: Rick Casey <caseyrick(at)gmail(dot)com>
To: pgsql-general(at)postgresql(dot)org
Subject: Re: optimizing import of large CSV file into partitioned table?
Date: 2010-03-29 14:55:13
Message-ID: ebcc3991003290755l735ee0e2tfa4ecc4b4ac94527@mail.gmail.com
Lists: pgsql-general
Thanks Dim; I was not aware of pgloader. This, and the other suggestions,
have helped a lot; thanks everyone.
--rick
On Mon, Mar 29, 2010 at 7:41 AM, Dimitri Fontaine <dfontaine(at)hi-media(dot)com> wrote:
> Rick Casey <caseyrick(at)gmail(dot)com> writes:
>
> > So, I am wondering if there is any way to optimize this process? I have been
> > using Postgres for several years, but have never had to partition or optimize
> > it for files of this size until now.
> > Any comments or suggestions would be most welcomed from this excellent
> > forum.
>
> The pgloader tool will import your data as batches of N lines; you get
> to say how many lines you want to consider in each transaction. Plus,
> you can have more than one Python thread importing your big file, either
> sharing one writer and having the other threads do the parsing and
> COPY, or having N independent threads each doing the reading/parsing/COPY.
>
> http://pgloader.projects.postgresql.org/
>
> Hope this helps,
> --
> dim
>
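In case it helps anyone else who lands on this thread: the batching idea above is basically just committing every N lines instead of loading the whole file in one transaction. Here is a rough sketch of that in plain psycopg2 (not pgloader itself; the table name, file path, connection string, and batch size are placeholders to adjust for your setup):

import io
import psycopg2

BATCH_SIZE = 10000               # lines per COPY/transaction; tune as needed
CSV_FILE = "bigfile.csv"         # placeholder path to the big CSV
TABLE = "my_table"               # placeholder; COPY straight into the right partition if you can

conn = psycopg2.connect("dbname=mydb user=me")  # placeholder connection string
cur = conn.cursor()

def flush(lines):
    # COPY one batch of raw CSV lines and commit it as its own transaction.
    if not lines:
        return
    buf = io.StringIO("".join(lines))
    cur.copy_expert("COPY " + TABLE + " FROM STDIN WITH CSV", buf)
    conn.commit()

with open(CSV_FILE) as f:
    next(f)                      # skip the header row, if there is one
    batch = []
    for line in f:
        batch.append(line)
        if len(batch) >= BATCH_SIZE:
            flush(batch)
            batch = []
    flush(batch)                 # last partial batch

cur.close()
conn.close()

pgloader also handles bad rows and the reader/writer thread split for you, which this sketch does not, so it only illustrates the per-batch commit part.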
--
----------------------------------------------------------------------------
Rick Casey :: caseyrick(at)gmail(dot)com :: 303.345.8893