From: | "David Wall" <d(dot)wall(at)computer(dot)org> |
---|---|
To: | "pgsql-jdbc" <pgsql-jdbc(at)postgresql(dot)org> |
Subject: | JDBC Blob API bug? |
Date: | 2002-08-29 20:14:09 |
Message-ID: | 002801c24f98$a217fea0$3201a8c0@expertrade.com |
Lists: | pgsql-jdbc |
It's hard to fault the PG JDBC library for this, but it does appear to be a
problem with the java.sql.Blob API (or at least one that is not well
documented). I'm running 7.2.2.
If you retrieve a Blob and then use Blob.getBytes(1, (int) blob.length())
(positions in the Blob API are 1-based) to pull the entire blob into a byte
array, there is no mechanism to "close" the Blob. With PG JDBC, that call
does a seek and read against the underlying LargeObject, but there is no way
to close it, so the stream stays open. This leads to strange errors on
subsequent calls (for example, the SQLException "No results were returned by
the query.").
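For reference, the call shape of the standard API can be tried with the
JDK's memory-backed SerialBlob. It never touches a LargeObject, so it cannot
reproduce the open-stream problem described above; it only illustrates the
1-based getBytes()/length() usage (class name below is mine):

```java
import java.nio.charset.StandardCharsets;
import java.sql.Blob;
import javax.sql.rowset.serial.SerialBlob;

public class BlobApiShape {
    public static void main(String[] args) throws Exception {
        // SerialBlob is the JDK's in-memory Blob implementation; it stands
        // in here for the driver's LargeObject-backed Blob.
        Blob b = new SerialBlob("hello".getBytes(StandardCharsets.US_ASCII));
        // Blob positions are 1-based: the first byte is at position 1.
        byte[] all = b.getBytes(1, (int) b.length());
        System.out.println(new String(all, StandardCharsets.US_ASCII)); // prints "hello"
    }
}
```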
The only workaround I've seen is to use Blob.getBinaryStream(), read in the
data, then close the stream, which in turn closes the underlying
LargeObject.
Here's a utility routine I used for converting a Blob into a byte[] when
doing a SELECT:

public byte[] blobToBytes(java.sql.Blob b)
{
    java.io.InputStream is = null;
    try
    {
        is = b.getBinaryStream();
        int size = (int) b.length();
        byte[] bytes = new byte[size];
        // InputStream.read may return fewer bytes than requested,
        // so loop until the whole blob has been read.
        int off = 0;
        while (off < size)
        {
            int n = is.read(bytes, off, size - off);
            if (n < 0)
                return null;   // stream ended before the full blob was read
            off += n;
        }
        return bytes;
    }
    catch (java.sql.SQLException e)
    {
        return null;
    }
    catch (java.io.IOException e)
    {
        return null;
    }
    finally
    {
        // Closing the stream closes the underlying LargeObject.
        try
        {
            if (is != null)
                is.close();
        }
        catch (Exception e) {}
    }
}
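One subtlety worth noting: InputStream.read(byte[]) is allowed to return
after filling only part of the array, so a robust version loops until the
buffer is full. A self-contained demonstration, using a stream that hands
back at most two bytes per call to simulate chunked delivery (class and
method names below are mine, not from the driver):

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;

public class ReadFullyDemo {
    // Loop until the array is full or the stream ends; a single read()
    // call is permitted to return fewer bytes than requested.
    public static byte[] readFully(InputStream is, int size) throws IOException {
        byte[] bytes = new byte[size];
        int off = 0;
        while (off < size) {
            int n = is.read(bytes, off, size - off);
            if (n < 0)
                throw new IOException("stream ended early");
            off += n;
        }
        return bytes;
    }

    public static void main(String[] args) throws IOException {
        // A stream that returns at most 2 bytes per read() call,
        // simulating a driver stream that fills the buffer in chunks.
        InputStream chunky = new ByteArrayInputStream("largeobject".getBytes()) {
            @Override public int read(byte[] b, int o, int len) {
                return super.read(b, o, Math.min(len, 2));
            }
        };
        System.out.println(new String(readFully(chunky, 11))); // prints "largeobject"
    }
}
```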
David
From | Date | Subject | |
---|---|---|---|
Next Message | Dave Cramer | 2002-08-29 20:25:18 | Re: Pooling Prepared Statements |
Previous Message | G.Nagarajan | 2002-08-29 18:52:40 | Re: Pooling Prepared Statements |