From: Albe Laurenz <laurenz(dot)albe(at)wien(dot)gv(dot)at>
To: liuyuanyuan <liuyuanyuan(at)highgo(dot)com(dot)cn>, Chris Travers <chris(dot)travers(at)gmail(dot)com>, tv <tv(at)fuzzy(dot)cz>
Cc: pgsql-general <pgsql-general(at)postgresql(dot)org>
Subject: Re: inserting huge file into bytea cause out of memory
Date: 2013-08-07 06:56:21
Message-ID: A737B7A37273E048B164557ADEF4A58B17BF47B3@ntex2010a.host.magwien.gv.at
Lists: pgsql-general

liuyuanyuan wrote:
> By the way, my project is about migrating Oracle data of BLOB type to
> PostgreSQL database. The out of memory error occurred between migrating
> Oracle BLOB to PostgreSQL bytea. Another question, if I can't migrate BLOB to bytea,
> how about oid type ?
Large Objects (I guess that's what you mean with "oid" here)
might be the better choice for you, particularly since you
are running into out-of-memory problems.
While bytea is always written in one piece, you can stream
large objects by reading and writing them in smaller chunks.
Moreover, large objects have a larger size limit than
bytea's 1 GB.
The downside is that the API is slightly more complicated,
and you'll have to take care that the large object gets
deleted when you remove the last reference to it from your
database.
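To illustrate the streaming idea: the point of large objects is that you never hold the whole BLOB in memory, only one chunk at a time. A minimal sketch of such a chunked copy loop follows; it works on any pair of binary file-like objects, and with psycopg2 the destination would be a large object opened via `conn.lobject(0, 'wb')` (the chunk size of 64 KB is an arbitrary choice, not anything mandated by PostgreSQL):

```python
import io

CHUNK_SIZE = 64 * 1024  # copy in 64 KB pieces so memory use stays flat


def stream_copy(src, dst, chunk_size=CHUNK_SIZE):
    """Copy src to dst in fixed-size chunks instead of one big read().

    src and dst are binary file-like objects; e.g. src could be a
    stream read from an Oracle BLOB and dst a psycopg2 large object
    (conn.lobject(0, 'wb')). Returns the number of bytes copied.
    """
    total = 0
    while True:
        chunk = src.read(chunk_size)
        if not chunk:
            break
        dst.write(chunk)
        total += len(chunk)
    return total
```

The same loop shape applies with the server-side lo_read/lo_write interface or the JDBC LargeObject API; only the read and write calls change.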
Yours,
Laurenz Albe