From: Peter Haight <peterh(at)sapros(dot)com>
To: pgsql-general(at)postgresql(dot)org
Subject: Large object insert performance.
Date: 2000-08-23 21:17:30
Message-ID: 200008232117.OAA34910@wartch.sapros.com
Lists: pgsql-general
I'm populating a new database from some text files. I'm using large objects
to store the body of the text files. I have a little thing set up to monitor
how fast the inserts are going. They started out at about 20/sec and have
been slowly dropping. I'm about 6% through my data and I'm already down to
2/sec and dropping. All I'm doing is inserting the large objects; no other
action is happening. Here's the portion of the script that is populating my
database:
self.db.query('begin')
body_lo = self.db.locreate(pg.INV_READ | pg.INV_WRITE)
body_lo.open(pg.INV_WRITE)
body_lo.write(puff.get('message/body', ''))
body_oid = body_lo.oid
body_lo.close()
self.db.query('end')
That is the full extent of my queries to the database. There are no tables
or indexes defined. The average size of a body is about 300 bytes, but it
goes as high as 30k.
Is there any way to speed this up? If the handling of large objects is this
bad, I think I might just store these files on the filesystem instead.
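For reference, a minimal sketch of that filesystem alternative: write each body to its own file and keep only the path (or a key) in a database row. The layout and function name here are illustrative assumptions, not anything from the original script:

```python
# Illustrative sketch (assumed names/layout): store each body as a
# file named by its SHA-1 digest, fanned out into two-character
# subdirectories so no single directory grows huge.
import hashlib
import os

def store_body(root, body):
    """Write `body` (bytes) under `root` and return the file path.

    Hashing makes the filename deterministic, so duplicate bodies
    are stored only once.
    """
    digest = hashlib.sha1(body).hexdigest()
    subdir = os.path.join(root, digest[:2])
    os.makedirs(subdir, exist_ok=True)
    path = os.path.join(subdir, digest)
    with open(path, 'wb') as f:
        f.write(body)
    return path
```

The database row would then store the returned path instead of a large-object OID.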