Re: inserting huge file into bytea cause out of memory

From: Michael Paquier <michael(dot)paquier(at)gmail(dot)com>
To: Albe Laurenz <laurenz(dot)albe(at)wien(dot)gv(dot)at>
Cc: liuyuanyuan <liuyuanyuan(at)highgo(dot)com(dot)cn>, Chris Travers <chris(dot)travers(at)gmail(dot)com>, tv <tv(at)fuzzy(dot)cz>, pgsql-general <pgsql-general(at)postgresql(dot)org>
Subject: Re: inserting huge file into bytea cause out of memory
Date: 2013-08-07 07:26:12
Message-ID: CAB7nPqSLFsXonBLN_CqsAWDbrodN7SaUhrXYPheBVkWgrz7uBQ@mail.gmail.com
Lists: pgsql-general

On Wed, Aug 7, 2013 at 3:56 PM, Albe Laurenz <laurenz(dot)albe(at)wien(dot)gv(dot)at> wrote:
> liuyuanyuan wrote:
>> By the way, my project is about migrating Oracle data of BLOB type to
>> PostgreSQL database. The out of memory error occurred between migrating
>> Oracle BLOB to PostgreSQL bytea. Another question, if I can't migrate BLOB to bytea,
>> how about oid type ?
>
> Large Objects (I guess that's what you mean with "oid" here)
> might be the better choice for you, particularly since you
> have out of memory problems.
Take care that the limit on large objects is 2GB in Postgres 9.2 or
lower (with the default block size). That said, you should be fine in
your application's case. It is also worth noting that this limit is
increased to 4TB in 9.3.
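
To illustrate why large objects sidestep the out-of-memory problem: unlike a
bytea value, which must be materialized whole, a large object can be read and
written in fixed-size chunks. Here is a minimal Python sketch of the chunked
copy loop (the helper name and chunk size are my own, not from any library);
with psycopg2, the destination stream could be a large-object handle obtained
from the connection instead of the in-memory buffer used below:

```python
import io

CHUNK_SIZE = 8192  # copy in small pieces; peak memory stays O(CHUNK_SIZE)

def copy_in_chunks(src, dst, chunk_size=CHUNK_SIZE):
    """Copy a binary stream chunk by chunk; return total bytes copied.

    src and dst are any file-like objects. In a migration, src could be
    the Oracle BLOB stream and dst a PostgreSQL large-object handle
    (hypothetical usage), so the whole BLOB is never held in memory.
    """
    total = 0
    while True:
        chunk = src.read(chunk_size)
        if not chunk:
            break
        dst.write(chunk)
        total += len(chunk)
    return total

# Demonstration with in-memory streams standing in for real handles:
src = io.BytesIO(b"x" * 20000)
dst = io.BytesIO()
print(copy_in_chunks(src, dst))  # 20000
```

The same loop works for any object size, since only one chunk is resident at
a time.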
--
Michael
