From: Rolland Crunk <rc(at)aenet(dot)net>
To: pgsql-interfaces(at)postgresql(dot)org
Subject: JDBC - large objects
Date: 1999-07-06 09:03:54
Message-ID: 4.1.19990706020226.00c3f8e0@crunk.net
Lists: pgsql-interfaces
I am having some problems getting the JDBC driver to work properly with
large objects using the standard JDBC interfaces. The tables are fairly
standard relational tables, except for one column into which I serialize
implementations of java.security.acl.Acl as objects.

The error I get is: ERROR: lo_write: invalid large obj descriptor (0)

This is the same error I got running the blobtest until I applied Tatsuo Ishii's
patch, which I found in the mailing list archives. I tried the same thing in my
code (turning on explicit transactions when storing a blob), but it doesn't
seem to have any effect.
I have tried defining the acl field in my CREATE TABLE statement as both:
: :
acl oid,
: :
and
: :
acl char[]
: :
and I see the same thing in both cases.
The same Java code runs fine against Oracle 8 with their thin driver.

I guess what I need to know is: is what I am trying to do possible with
PostgreSQL/JDBC without using the PostgreSQL-specific extensions? (Using
them is not an option for me.) If so, what type should I use for the
serialized column in the CREATE TABLE statement? And can it be done without
turning off autocommit?
Thanks in advance for any help anyone can provide.
Cordially,
rc
ps: My environment is:
Solaris 2.7 (intel)
jdk 1.2 (jdk 1.1 fares no better)
PostgreSQL 6.5
Next Message: Peter Mount, 1999-07-06 11:36:00, "RE: [INTERFACES] JDBC - large objects"
Previous Message: Rolland Crunk, 1999-07-06 08:56:40, "JDBC - large objects"