We have a large table (about 9,000,000 rows, roughly 2.8 GB in total) which is
exported to a binary file. PostgreSQL 8.2 is running on a Windows Small
Business Server 2003 machine with 2 GB of RAM. When we run a "copy tablename
from filepath" command, memory usage climbs to about 1.8 GB and PostgreSQL
raises an "out of memory" exception. If we copy only a small part of the table
(e.g. 1,000,000 rows), everything works fine.
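
For reference, the statement we run has this form (the table and file names
here are just placeholders for our real ones):

    COPY big_table FROM 'C:/export/big_table.bin' WITH BINARY;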
As far as I understand, PostgreSQL is trying to load all the rows into RAM
before writing them to the database. I tried running PostgreSQL with several
different configuration parameters, among them the memory-related settings
sketched below, but the result is the same.
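
For example, in postgresql.conf we experimented with combinations along these
lines (the exact values varied between attempts):

    shared_buffers = 32MB
    work_mem = 1MB
    maintenance_work_mem = 16MB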
Has anybody faced a similar problem?
Kind regards
A. Ozen Akyurek