| From: | Sam Varshavchik <mrsam(at)courier-mta(dot)com> |
|---|---|
| Cc: | Postgres JDBC <pgsql-jdbc(at)postgresql(dot)org> |
| Subject: | Re: Recommended technique for large imports? |
| Date: | 2002-09-15 00:01:42 |
| Message-ID: | 1032048102.814881.6211.501.oak@ny.email-scan.com |
| Lists: | pgsql-general pgsql-jdbc |
Stephen Bacon writes:
> Now I know the COPY command is much faster because it doesn't update the
> indexes after every row insert, but building that and passing it via
> jdbc seems iffy (or C, PHP, etc. for that matter).
I think someone was working on a COPY implementation for JDBC, but I don't
think it's there yet.
> Can anyone give a recommended technique for this sort of process?
Feed a few thousand INSERTs to addBatch(), then call executeBatch(). That
seems to be the fastest way to import data at this time.
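A minimal sketch of that batching approach using the standard JDBC API. The connection URL, table name (`items`), column names, and batch size are all placeholder assumptions, not anything from the thread; wrapping the whole import in one transaction is an additional common optimization, not something the post prescribes.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class BatchImport {
    static final int BATCH_SIZE = 5000; // "a few thousand" per executeBatch()

    // Pure helper: how many executeBatch() calls a given row count needs.
    static int flushCount(int rows, int batchSize) {
        return (rows + batchSize - 1) / batchSize;
    }

    public static void main(String[] args) throws Exception {
        // URL, credentials, table, and columns are hypothetical placeholders.
        try (Connection conn = DriverManager.getConnection(
                "jdbc:postgresql://localhost/mydb", "user", "secret")) {
            conn.setAutoCommit(false); // one transaction for the whole import
            String sql = "INSERT INTO items (id, name) VALUES (?, ?)";
            try (PreparedStatement ps = conn.prepareStatement(sql)) {
                for (int i = 0; i < 100_000; i++) {
                    ps.setInt(1, i);
                    ps.setString(2, "row-" + i);
                    ps.addBatch();
                    if ((i + 1) % BATCH_SIZE == 0) {
                        ps.executeBatch(); // flush a few thousand at a time
                    }
                }
                ps.executeBatch(); // flush any remainder
            }
            conn.commit();
        }
    }
}
```

Reusing one PreparedStatement and flushing in fixed-size batches keeps round trips down without holding every pending row in driver memory at once.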