From: Tao Yang <tyang(at)cloverworxs(dot)com>
To: pgsql-jdbc(at)postgresql(dot)org
Subject: Out of memory exception
Date: 2004-04-16 21:29:39
Message-ID: 40805043.6020502@cloverworxs.com
Lists: pgsql-jdbc
Hi,
First, I would like to thank you for your wonderful work on PostgreSQL
and the JDBC driver.
We have run into a particular problem when uploading a large blob into
PostgreSQL. In our web application, we allow users to upload files to
the server, which we store as blobs inside the PostgreSQL database.
During testing, we found that if the file is too big, an "OutOfMemory"
exception is thrown. I tried increasing the JVM heap size from the
default 64MB (on Linux) to 96MB, 128MB, and 512MB. That helps a bit, but
unfortunately does not resolve the problem: with a 96MB heap I could
upload a 2MB file, and with 512MB a 13MB file (but only once; when I
tried a second time in a row, it failed again with the same
"OutOfMemory" error). I then tried the same test against MS SQL Server,
which does not have this problem at all: with the default JVM heap size
it handles the 13MB file very well. All other code in our app is
identical for PostgreSQL and MS SQL except for loading the driver.
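(For reference, the heap was raised with the standard JVM -Xmx flag; the
jar and class names below are just placeholders for our application:)

java -Xmx512m -cp postgresql.jar:app.jar com.example.UploadApp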
Is it possible that the PostgreSQL JDBC driver does not handle large
objects very well, for example by not using a stream when loading a big
file? Do you have any clue why we are having these problems?
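To illustrate what I mean by streaming, here is a minimal sketch of the
kind of upload code I have in mind. The connection settings and the
"uploads" table (a text column plus a bytea column) are hypothetical,
not our actual schema; the point is that the file is handed to the
driver as an InputStream via setBinaryStream rather than read into one
big byte[]:

import java.io.File;
import java.io.FileInputStream;
import java.io.InputStream;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class UploadSketch {
    public static void main(String[] args) throws Exception {
        Class.forName("org.postgresql.Driver"); // load the driver explicitly

        // Hypothetical connection settings; adjust to your environment.
        Connection conn = DriverManager.getConnection(
            "jdbc:postgresql://localhost/mydb", "user", "password");

        File file = new File(args[0]);
        InputStream in = new FileInputStream(file);
        try {
            // Hypothetical table: CREATE TABLE uploads (name text, data bytea)
            PreparedStatement ps = conn.prepareStatement(
                "INSERT INTO uploads (name, data) VALUES (?, ?)");
            ps.setString(1, file.getName());
            // Hand the driver the stream plus its length, so it can (in
            // principle) read the file incrementally instead of buffering
            // it all in memory as a single byte[].
            ps.setBinaryStream(2, in, (int) file.length());
            ps.executeUpdate();
            ps.close();
        } finally {
            in.close();
        }
        conn.close();
    }
}

Of course, if the driver buffers the whole stream internally anyway,
this would not help, which is exactly what I am wondering about.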
Thanks a lot,
Tao