Buffered input for big numeric input files

From: Guillaume Marçais <gus(at)nist(dot)gov>
To: pgsql-sql(at)postgresql(dot)org
Subject: Buffered input for big numeric input files
Date: 1999-05-27 20:28:56
Message-ID: Pine.LNX.4.05.9905271612260.25149-100000@dstp02.ncsl.nist.gov
Lists: pgsql-sql

I want to fill a database from the columns of a numeric table stored in a
file, placing the values from each column into a separate array in the
database. I know that I can read each row of the file, fill in the arrays
one value at a time, and, after reading the entire file, write each array
out to the database from memory (with an INSERT INTO statement). However,
I'd rather not hold all of the arrays in memory. So I'd like to write the
values into the database as they are read. Is there a way to create a set
of fixed-length arrays and, as they are filled, periodically update the
database and flush them -- in effect buffering the input? This would let
me process arbitrarily large files without having to worry about memory
requirements.
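A minimal sketch of the buffering scheme described above, done client-side
rather than in SQL: per-column buffers of a fixed size are filled as rows
are read, and each time they fill up an INSERT is issued and the buffers
are cleared. The table name, column names, and the execute() callback are
hypothetical stand-ins for a real database connection; the array-literal
syntax ('{1,2,3}') assumes PostgreSQL array columns.

```python
BUFSIZE = 1000  # rows buffered per flush; tune to taste


def flush(buffers, execute):
    # Emit one INSERT whose values are PostgreSQL array literals,
    # one array per buffered column, then empty the buffers.
    # "samples", "c1", "c2" are hypothetical names.
    arrays = ", ".join("'{%s}'" % ",".join(b) for b in buffers)
    execute("INSERT INTO samples (c1, c2) VALUES (%s);" % arrays)
    for b in buffers:
        del b[:]


def buffered_load(lines, ncols, bufsize, execute):
    # Read whitespace-delimited numeric rows, appending each value to
    # its column's buffer; flush whenever the buffers reach bufsize.
    buffers = [[] for _ in range(ncols)]
    for line in lines:
        for i, value in enumerate(line.split()):
            buffers[i].append(value)
        if len(buffers[0]) >= bufsize:
            flush(buffers, execute)
    if buffers[0]:  # write out any partial final buffer
        flush(buffers, execute)
```

With bufsize rows in memory at a time, an arbitrarily large file can be
loaded without ever materializing the full arrays on the client side.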

Thanks in advance for any suggestions.

Gus.
