From: Richard Huxton <dev(at)archonet(dot)com>
To: frank church <pgsql(at)adontendev(dot)net>
Cc: pgsql-sql(at)postgresql(dot)org
Subject: Re: Loading lots of data in a SQL command
Date: 2006-01-04 09:21:22
Message-ID: 43BB9392.5080805@archonet.com
Lists: pgsql-sql
frank church wrote:
> I am loading lots of data via SQL into a database, and wrapping it in
> transactions speeds it up.
>
> However, this fails a number of times. The query results are logged, so it
> is easy for me to find problem records.
>
> However, a single failure causes the whole transaction to fail.
>
> Is there a setting or feature which allows the same performance as
> transactions without causing the whole process to fail, like a delayed
> update or write mechanism of some sort?
Not as it stands. I tend to use a small Perl wrapper myself that loads
in batches of e.g. 10,000 rows and, if there is an error, deals with that
batch separately.
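
A minimal sketch of such a wrapper (the connection details, table and
column names, and tab-separated input on STDIN are placeholders, not
anything from your setup):

#!/usr/bin/perl
# Batch loader sketch: commit every $BATCH rows; if a batch fails,
# roll back and retry it row by row so one bad row doesn't lose the
# whole batch.
use strict;
use warnings;
use DBI;

my $BATCH = 10_000;
my $dbh = DBI->connect('dbi:Pg:dbname=mydb', 'user', 'pass',
                       { AutoCommit => 0, RaiseError => 1 });
# pg_server_prepare => 0 keeps the statement usable after a rollback.
my $sth = $dbh->prepare('INSERT INTO mytable (a, b) VALUES (?, ?)',
                        { pg_server_prepare => 0 });

my @pending;
while (my $line = <STDIN>) {
    chomp $line;
    push @pending, [ split /\t/, $line ];
    flush_batch() if @pending >= $BATCH;
}
flush_batch();    # whatever is left over
$dbh->disconnect;

sub flush_batch {
    return unless @pending;
    eval {
        $sth->execute(@$_) for @pending;
        $dbh->commit;
    };
    if ($@) {    # batch failed: redo it one row (one transaction) at a time
        $dbh->rollback;
        for my $row (@pending) {
            eval { $sth->execute(@$row); $dbh->commit; };
            if ($@) {
                $dbh->rollback;
                warn "skipping bad row: @$row\n";
            }
        }
    }
    @pending = ();
}

The row-by-row retry is the slow path, so as long as failures are rare
you keep almost all of the batching speedup.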
I seem to recall this being discussed as a built-in feature recently,
though, so someone may be working on it for a future version.
> It is something I would like to set in that particular data load.
You might find the "pgloader" project meets your needs exactly:
http://pgfoundry.org/projects/pgloader/
--
Richard Huxton
Archonet Ltd