From: bbeyer(at)purdue(dot)edu
To: pgsql-general(at)postgresql(dot)org
Subject: Possible limit on transaction size?
Date: 2008-09-08 13:17:49
Message-ID: 1220879869.48c525fd40b31@webmail.purdue.edu
Lists: pgsql-general
Hello,
I was curious whether there is a known size limit for Postgres transactions. To
import data into my database, my Java application begins a transaction, imports
the data (into several different tables), and then commits the transaction on
success. This works great on small data sets, but not so well on large ones.
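For reference, the import pattern looks roughly like the sketch below, written
as JDBC with periodic commits instead of one giant transaction (connection
handling, table name, and column names here are placeholders, not my actual
schema):

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.util.Iterator;

public class BatchedImport {
    static final int BATCH_SIZE = 100_000; // commit every 100k rows (tunable)

    // True when row number rowsDone (1-based) closes out a batch.
    static boolean shouldCommit(long rowsDone, int batchSize) {
        return rowsDone % batchSize == 0;
    }

    // Sketch of the import loop: one transaction per batch so no single
    // transaction grows without bound. Table and column names are hypothetical.
    static void importRows(Connection conn, Iterator<Object[]> rows) throws Exception {
        conn.setAutoCommit(false);
        try (PreparedStatement ps = conn.prepareStatement(
                "INSERT INTO table_name (col1, col2, col3, col4) VALUES (?, ?, ?, ?)")) {
            long done = 0;
            while (rows.hasNext()) {
                Object[] r = rows.next();
                for (int i = 0; i < 4; i++) {
                    ps.setObject(i + 1, r[i]);
                }
                ps.addBatch();
                done++;
                if (shouldCommit(done, BATCH_SIZE)) {
                    ps.executeBatch();
                    conn.commit(); // bounds the size of any one transaction
                }
            }
            ps.executeBatch();
            conn.commit(); // flush the final partial batch
        }
    }

    public static void main(String[] args) {
        // No database here; just show where the batch boundaries would fall.
        System.out.println(shouldCommit(100_000, BATCH_SIZE));
        System.out.println(shouldCommit(100_001, BATCH_SIZE));
    }
}
```

Committing in batches like this would be one workaround, though it gives up
the all-or-nothing semantics of a single transaction.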
About 150 million records into the import process, I get the following error:
ERROR: lock AccessShareLock on object 51533/51769/0 is already held
CONTEXT: SQL statement "INSERT INTO table_name (col1, col2, col3, col4) VALUES
(val1, val2, val3, val4)"
PL/pgSQL function "create_import" line 19 at SQL statement
STATEMENT: select * from create_import($1,$2,$3,$4,$5,$6) as result
I know my server can handle this much data (24 GB RAM, 2 TB SAS disks, etc.),
but Postgres doesn't seem to handle a transaction this large.
Any thoughts?
Thank you for your time,
Brian Beyer
Purdue University
bbeyer(at)purdue(dot)edu