Storing large files in multiple schemas: BLOB or BYTEA

From: <tigran2-postgres(at)riatest(dot)com>
To: <pgsql-general(at)postgresql(dot)org>
Subject: Storing large files in multiple schemas: BLOB or BYTEA
Date: 2012-10-10 10:16:53
Message-ID: 007201cda6d0$61813b00$2483b100$@riatest.com
Lists: pgsql-general

Hi,

I need to store large files (from several MB up to 1GB) in a Postgres database.
The database has multiple schemas. It looks like Postgres has two options for
storing large objects: LOB and BYTEA. However, we seem to hit problems with
each of these options.

1. LOB. This works almost ideally: it can store up to 2GB and allows streaming,
so we do not hit memory limits in our PHP backend when reading the LOB.
However, all blobs are stored in pg_catalog and are not part of a schema. This
leads to a big problem when you try to use pg_dump with the options -n and -b
to dump just one schema with its blobs. It dumps the schema data correctly, but
it then includes ALL blobs in the database, not just the blobs that belong to
that particular schema.

Is there a way to dump a single schema with its blobs using pg_dump or some
other utility?
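
For reference, the kind of streaming we rely on with LOB looks roughly like
this on the PHP side (just a sketch; the connection string and the
myschema.files table with its content_oid column are made-up names for
illustration):

<?php
// Sketch only: stream a large object in 8 KB chunks so the whole
// file never has to fit into PHP's memory_limit.
// Connection string, table and column names are placeholders.
$conn = pg_connect('dbname=mydb user=myuser');

// Look up the OID of the stored large object (the files table is hypothetical).
$res = pg_query_params($conn,
    'SELECT content_oid FROM myschema.files WHERE id = $1', array(42));
$oid = (int) pg_fetch_result($res, 0, 0);

// Large object calls must run inside a transaction.
pg_query($conn, 'BEGIN');
$lo = pg_lo_open($conn, $oid, 'r');
while (($chunk = pg_lo_read($lo, 8192)) !== false && $chunk !== '') {
    echo $chunk;   // pass each 8 KB chunk straight to the client
}
pg_lo_close($lo);
pg_query($conn, 'COMMIT');
?>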

2. BYTEA. These are correctly stored per schema, so pg_dump -n works as
expected; however, I cannot seem to find a way to stream the data. This means
there is no way to access the data from PHP if it is larger than the memory
limit.
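
To illustrate what I mean: with BYTEA the value comes back from the driver in
one piece, so something like the following (again with made-up table and
column names) needs the whole file, up to 1GB, to fit into PHP's memory_limit:

<?php
// Sketch only: a BYTEA column is returned as one string, so the
// entire file is held in PHP memory at once. Names are placeholders.
$conn = pg_connect('dbname=mydb user=myuser');
$res  = pg_query_params($conn,
    'SELECT content FROM myschema.files_bytea WHERE id = $1', array(42));
$data = pg_unescape_bytea(pg_fetch_result($res, 0, 0)); // whole file in memory
echo $data;
?>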

Is there any other way to store large data in Postgres that allows streaming
and works correctly with multiple schemas per database?

Thanks.

(Sorry if this double-posts on pgsql-php; I did not know which list is best
for this question.)
