| From: | Nelson Arapé <narape(at)ica(dot)luz(dot)ve> |
|---|---|
| To: | pgsql-jdbc(at)postgresql(dot)org |
| Subject: | Re: process large tables |
| Date: | 2005-04-14 20:50:21 |
| Message-ID: | 200504141650.21673.narape@ica.luz.ve |
| Lists: | pgsql-jdbc |
From the documentation
(http://jdbc.postgresql.org/documentation/80/query.html#query-with-cursor)
"By default the driver collects all the results for the query at once. This
can be inconvenient for large data sets so the JDBC driver provides a means
of basing a ResultSet on a database cursor and only fetching a small number
of rows.
..."
With a cursor the driver fetches the rows in batches instead of all at once. It is well explained in the
documentation.
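Something along these lines should work (a rough sketch only; the connection URL, credentials, table name and the fetch size of 10,000 are placeholders to adapt to your setup). Note that the driver only uses a cursor when autocommit is off and the statement is the default TYPE_FORWARD_ONLY:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

public class CursorFetch {
    public static void main(String[] args) throws SQLException {
        // Placeholder connection details -- replace with your own.
        Connection conn = DriverManager.getConnection(
                "jdbc:postgresql://localhost/mydb", "user", "password");

        // The driver only switches to cursor-based fetching when autocommit is off.
        conn.setAutoCommit(false);

        Statement st = conn.createStatement();
        // Ask the driver for 10,000 rows per fetch instead of the whole result set.
        st.setFetchSize(10000);

        ResultSet rs = st.executeQuery("SELECT * FROM mytable");
        while (rs.next()) {
            // process the current row here
        }
        rs.close();
        st.close();

        conn.commit();
        conn.close();
    }
}
```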
Bye
Nelson Arapé
On Thu 14 Apr 2005 16:37, Kristina Magwood wrote:
> Hi,
> I am trying to process a large table. Unfortunately, using select * from
> table gives me a ResultSet that is too large.
> Java runs out of memory even if I boost the VM memory.
> Is there any way I can programmatically (in java) retrieve say 10,000
> records at a time without knowing anything specific about the table? Then,
> when I am done with those records, retrieve the next 10,000, etc?
>
> Thank you in advance for any help you can spare.
> Kristina