From: Jonathan Bartlett <johnnyb(at)eskimo(dot)com>
To: pgsql-general(at)postgresql(dot)org
Subject: lo_import for bytea columns
Date: 2003-11-20 16:17:27
Message-ID: Pine.GSU.4.44.0311200812090.5850-100000@eskimo.com
Lists: pgsql-general
Is there an equivalent function for bytea columns that works like
lo_import?
Alternatively, is there a way to copy from a large object to a bytea
column from SQL?
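To make it concrete, what I'm hoping for is something along these lines
(bytea_import is a name I made up, and the path is just a placeholder; I
haven't found any such function in the docs, which is why I'm asking):

insert into data (bigbyteacolumn)
values (bytea_import('/path/on/the/server/somefile.dat'));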
Or maybe someone has another way of attacking this problem:
I've got some Perl code that does this:
undef $/;                  # turn off the input record separator, so the next read slurps the whole file
$data = <FHFOR89MBFILE>;   # reads the entire file into memory at once
$sth = $dbh->prepare("insert into data (bigbyteacolumn) values (?)");
$sth->bind_param(1, $data, DBI::SQL_BINARY);
$sth->execute;
This has worked fine for a while with file sizes around 10MB.
However, now I have someone who wants to use it for a file that's 89MB,
and the process takes up about 500MB of memory before crashing. I'm
trying to find a less memory-hungry way of handling this, even if it's
just a temporary hack for this one file. I think what's happening is
that Perl reads in the full 89MB, and then either Perl or the driver
converts that into a fully-escaped string for transfer, and that's
where the problem is occurring.
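One stopgap I've been toying with is pushing the data over in pieces
instead of as one giant bind, roughly like the untested sketch below.
The id column, the 1MB chunk size, and the reliance on bytea
concatenation are all just guesses on my part, so I don't know whether
the server side would cope with it any better:

# $bigfile and $id are placeholders for the file path and the row's key
open(my $fh, '<', $bigfile) or die "can't open $bigfile: $!";
binmode($fh);

# start the row with an empty value, then append to it a chunk at a time
$dbh->do("insert into data (id, bigbyteacolumn) values (?, '')",
         undef, $id);

my $sth = $dbh->prepare(
    "update data set bigbyteacolumn = bigbyteacolumn || ? where id = ?");

while (read($fh, my $chunk, 1024 * 1024)) {    # 1MB at a time
    $sth->bind_param(1, $chunk, DBI::SQL_BINARY);
    $sth->bind_param(2, $id);
    $sth->execute;
}
close($fh);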
Any ideas?
Thanks,
Jonathan Bartlett