| From: | Radosław Smogura <rsmogura(at)softperience(dot)eu> |
|---|---|
| To: | Александър Шопов <lists(at)kambanaria(dot)org> |
| Cc: | <pgsql-jdbc(at)postgresql(dot)org> |
| Subject: | Re: Workarounds for getBinaryStream returning ByteArrayInputStream on bytea |
| Date: | 2010-11-24 22:04:49 |
| Message-ID: | 5cdf97054e76f67c40b74de114691762@smogura-softworks.eu |
| Lists: | pgsql-hackers pgsql-jdbc |
I see only two possibilities:
1. Decrease the fetch size, e.g. to 1 (see the sketch below).
2. Refactor the schema, e.g. to large objects (see the sketch after the quoted message).
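
For option 1, a minimal sketch of what this could look like with the PostgreSQL JDBC driver (the table and column names `files`, `file_name`, `file_content` and the connection details are hypothetical). With autocommit off, a forward-only result set and a fetch size of 1, the driver fetches rows through a cursor, so only one row is held at a time; the stream returned by `getBinaryStream()` is still backed by a byte array, but it never covers more than that single row's content:

```java
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.util.zip.ZipEntry;
import java.util.zip.ZipOutputStream;

public class ByteaZipExport {
    public static void main(String[] args) throws Exception {
        try (Connection con = DriverManager.getConnection(
                "jdbc:postgresql://localhost/mydb", "user", "secret")) {
            // Cursor-based fetching in the PostgreSQL JDBC driver requires
            // autocommit to be off and a forward-only result set.
            con.setAutoCommit(false);

            try (PreparedStatement ps = con.prepareStatement(
                    "SELECT file_name, file_content FROM files")) {
                // Fetch one row at a time so only one bytea value
                // is materialized in memory at any moment.
                ps.setFetchSize(1);

                try (ResultSet rs = ps.executeQuery();
                     ZipOutputStream zip = new ZipOutputStream(
                             Files.newOutputStream(Paths.get("export.zip")))) {
                    byte[] buf = new byte[8192];
                    while (rs.next()) {
                        zip.putNextEntry(new ZipEntry(rs.getString("file_name")));
                        // Still a ByteArrayInputStream over the fetched row,
                        // but only this single row's bytes are in memory.
                        try (InputStream in = rs.getBinaryStream("file_content")) {
                            int n;
                            while ((n = in.read(buf)) != -1) {
                                zip.write(buf, 0, n);
                            }
                        }
                        zip.closeEntry();
                    }
                }
            }
            con.commit();
        }
    }
}
```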
Kind regards,
Radek
On Wed, 24 Nov 2010 22:50:46 +0200, Александър Шопов <lists(at)kambanaria(dot)org> wrote:
> Hi everyone,
> I have a table containing file contents in bytea columns.
> The functionality I am trying to achieve is having a result set
> containing such columns, iterating over them and streaming them while
> zipping them.
[...]
> So what are my options? Refactor the DB schema to use blobs rather than
> bytea? Is it impossible to have bytea read in chunks?
> Kind regards:
> al_shopov
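
For option 2, a rough sketch of what the large-object route could look like, assuming a refactored (hypothetical) table `files_lo` whose `file_content` column is an `oid` referencing a large object; `getBlob()` then gives a stream that reads from the server in chunks instead of materializing the whole value:

```java
import java.io.InputStream;
import java.sql.Blob;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class LargeObjectRead {
    public static void main(String[] args) throws Exception {
        try (Connection con = DriverManager.getConnection(
                "jdbc:postgresql://localhost/mydb", "user", "secret")) {
            // Large object access must happen inside a transaction.
            con.setAutoCommit(false);

            // Hypothetical schema: file_content is an oid referencing a large object.
            try (PreparedStatement ps = con.prepareStatement(
                    "SELECT file_name, file_content FROM files_lo");
                 ResultSet rs = ps.executeQuery()) {
                byte[] buf = new byte[8192];
                while (rs.next()) {
                    Blob blob = rs.getBlob("file_content");
                    try (InputStream in = blob.getBinaryStream()) {
                        int n;
                        while ((n = in.read(buf)) != -1) {
                            // Process the chunk, e.g. write it into a ZipOutputStream.
                        }
                    }
                    blob.free();
                }
            }
            con.commit();
        }
    }
}
```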
--
----------
Radosław Smogura
http://www.softperience.eu