From: Sam Mason <sam(at)samason(dot)me(dot)uk>
To: pgsql-general(at)postgresql(dot)org
Subject: Re: Use PSQLFS for photo storage
Date: 2009-01-13 23:54:20
Message-ID: 20090113235420.GM3008@frubble.xen.chris-lamb.co.uk
Lists: pgsql-general
On Tue, Jan 13, 2009 at 03:28:18PM -0600, Jason Long wrote:
> Steve Atkins wrote:
> >On Jan 13, 2009, at 10:34 AM, Jason Long wrote:
> >>I would like to use PSQLFS(http://www.edlsystems.com/psqlfs/)
> >>to store 100 GB of images in PostgreSQL.
> >>
> >>Is there a better way to load 20,000 plus files reliably into Postgres?
That would imply they're around 5MB each on average? If they're all
under, say, 20MB (or maybe even much more), you should be able to
handle this with the most naive approach possible.
> I just want an easy way to load the files into the DB and their original
> path they were loaded from.
>
> Is possible through SQL to load a file into a bytea column?
You'd need to generate the SQL somehow; if you know Python it's
probably an easy 20 or 30 lines of code to get this working. psycopg
seems to be the recommended way of accessing PG from Python, and you
basically want to be doing something like:
  import psycopg2

  filename = "myimage.jpeg"
  conn = psycopg2.connect("")
  cur = conn.cursor()
  with open(filename, "rb") as f:
      cur.execute(
          "INSERT INTO pictures (filename, data) VALUES (%s, %s)",
          (filename, psycopg2.Binary(f.read())))
  conn.commit()
This seems to do the right thing for me, and obviously needs to be put
into a loop of some sort. But it'll hopefully get you started.
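Since you also wanted to keep each file's original path, the loop
could walk a directory tree and insert the path alongside the bytes.
A rough sketch (the "pictures" table, its columns, and the empty
connection string are assumptions; adjust to taste):

  import os

  def collect_files(root):
      # Yield (relative_path, raw_bytes) for every file under root.
      for dirpath, _dirnames, filenames in os.walk(root):
          for name in filenames:
              path = os.path.join(dirpath, name)
              with open(path, "rb") as f:
                  yield os.path.relpath(path, root), f.read()

  def load_tree(root, conninfo=""):
      import psycopg2
      conn = psycopg2.connect(conninfo)
      cur = conn.cursor()
      for relpath, data in collect_files(root):
          cur.execute(
              "INSERT INTO pictures (filename, data) VALUES (%s, %s)",
              (relpath, psycopg2.Binary(data)))
      conn.commit()  # one transaction for the whole tree
      conn.close()

Doing it all in a single transaction means a failure part-way through
leaves nothing half-loaded; for 20,000 smallish files that should be
fine.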
Sam