From: Xiaoyu <zhangxiaoyu912(at)gmail(dot)com>
To: pgsql-jdbc(at)postgresql(dot)org
Subject: how to handle very large data object efficiently
Date: 2007-07-26 15:54:32
Message-ID: 1185465272.247919.148280@k79g2000hse.googlegroups.com
Lists: pgsql-jdbc
Hi, folks:
I am new to JDBC programming, so please forgive me if this question is naive.
Problem description:
1. I need to store the data from a text file into a database. The format of the
file is similar to:
......
218596813 235940555 4387359 3 386658 4000 4 4
218597197 235940333 4366832 17 388842 5000 5 5
218597485 235940805 4374620 8 386226 4000 4 4
......
Each line of the file corresponds to one record in the database, and each
number corresponds to one column of that record.
2. The files are huge: each one normally has about 9,000,000 lines. My current
program reads a file line by line, parses each line, and stores the elements
into the database one row at a time. Because the files are so large, it can
take days to load a single file, and there are 50 similar files to process.
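
In outline, my current loader looks roughly like this (the table name
"records" and its columns c1..c8 are made up here just for illustration):

import java.io.BufferedReader;
import java.io.FileReader;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class LineByLineLoader {
    public static void main(String[] args) throws Exception {
        Class.forName("org.postgresql.Driver");
        Connection conn = DriverManager.getConnection(
                "jdbc:postgresql://localhost/mydb", "user", "password");
        PreparedStatement ps = conn.prepareStatement(
                "INSERT INTO records (c1, c2, c3, c4, c5, c6, c7, c8) "
                + "VALUES (?, ?, ?, ?, ?, ?, ?, ?)");
        BufferedReader in = new BufferedReader(new FileReader(args[0]));
        String line;
        while ((line = in.readLine()) != null) {
            String[] fields = line.trim().split("\\s+");
            for (int i = 0; i < 8; i++) {
                ps.setLong(i + 1, Long.parseLong(fields[i]));
            }
            ps.executeUpdate(); // one INSERT, one round trip per line
        }
        in.close();
        ps.close();
        conn.close();
    }
}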
3. Can anyone suggest a better approach than reading and inserting line by
line? Is there any way in JDBC to handle a block of data (several lines) at a
time? Many thanks
Xiaoyu
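
P.S. To make the question concrete: is JDBC statement batching the kind of
thing I should look at? This is only an untested sketch of what I mean,
reusing conn, ps, and in from the loop above; I do not have it working:

// Untested: same loader, but sending 10,000 rows per round trip.
conn.setAutoCommit(false);
String line;
int count = 0;
while ((line = in.readLine()) != null) {
    String[] fields = line.trim().split("\\s+");
    for (int i = 0; i < 8; i++) {
        ps.setLong(i + 1, Long.parseLong(fields[i]));
    }
    ps.addBatch();
    if (++count % 10000 == 0) {
        ps.executeBatch(); // one round trip for the whole batch
        conn.commit();
    }
}
ps.executeBatch();         // flush any remaining rows
conn.commit();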