Re: DBD::Pg transaction issues

From: Raj Mathur <raju(at)linux-delhi(dot)org>
To: pgsql-sql(at)postgresql(dot)org
Subject: Re: DBD::Pg transaction issues
Date: 2003-04-07 08:38:13
Message-ID: 16017.14581.156730.440857@mail.linux-delhi.org
Lists: pgsql-sql

Hi Rajesh,

>>>>> "Rajesh" == Rajesh Kumar Mallah <mallah(at)trade-india(dot)com> writes:

Rajesh> Dear Raju,

Rajesh> All subsequent statements (DML/DDL/SELECT) are ignored
Rajesh> after the first ERROR caused by any statement in the
Rajesh> transaction. The current transaction must be rolled back
Rajesh> and a new one started.

Rajesh> I can think of two approaches:

Rajesh> a) Prevent the error situation from arising may be by
Rajesh> first querying if the reference exists.

Which is what I'm doing now; it's an inelegant and inefficient
solution.
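Concretely, the check-first version looks roughly like this (a sketch only; the table and column names, and the read_record() helper, are invented for illustration):

```perl
# Check-first approach: query before writing so no statement ever
# errors out and aborts the transaction.  Assumes $dbh is connected
# with AutoCommit => 0; table/column names are made up.
my $ref_exists = $dbh->prepare("SELECT 1 FROM ref_table WHERE id = ?");
my $existing   = $dbh->prepare("SELECT 1 FROM records   WHERE key = ?");
my $insert     = $dbh->prepare(
    "INSERT INTO records (key, ref_id, val) VALUES (?, ?, ?)");
my $update     = $dbh->prepare(
    "UPDATE records SET val = ? WHERE key = ?");

while (my $rec = read_record()) {            # read_record() is hypothetical
    $ref_exists->execute($rec->{ref_id});
    next unless $ref_exists->fetchrow_array; # missing reference: ignore

    $existing->execute($rec->{key});
    if ($existing->fetchrow_array) {
        $update->execute($rec->{val}, $rec->{key});  # duplicate: update
    } else {
        $insert->execute($rec->{key}, $rec->{ref_id}, $rec->{val});
    }
}
```

Every record costs one or two extra SELECTs, which is where the inefficiency comes from.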

Rajesh> b) commit when the operation succeeds and rollback when
Rajesh> there is an ERROR from SQL.

That's not an alternative, since I need to commit either all the
records in the file or none of them.

Rajesh> In Perl DBI, eval is generally used for catching such SQL
Rajesh> errors without aborting the program.

Am using evals, but the transaction gets aborted in any case. The
eval is only useful for my program to trap the error. I'm trying to
find out how I can prevent the transaction mechanism from treating
errors on the database as logical errors.

So is there no way to prevent PostgreSQL (or DBI) from aborting the
current transaction if an operation on the database fails?
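For reference, later PostgreSQL releases answer this with savepoints: SAVEPOINT was added in PostgreSQL 8.0, so it is not available on 7.2.3, but it is exactly the mechanism for surviving a failed statement inside a transaction. A sketch of the per-record loop under that assumption (table/column names and read_record() are invented):

```perl
# Sketch only: SAVEPOINT requires PostgreSQL 8.0+ (not 7.2.3).
# Assumes $dbh connected with AutoCommit => 0, RaiseError => 1.
my $insert = $dbh->prepare(
    "INSERT INTO records (ref_id, val) VALUES (?, ?)");

my $file_bad = 0;
while (my $rec = read_record()) {        # read_record() is hypothetical
    $dbh->do("SAVEPOINT per_record");
    eval {
        $insert->execute($rec->{ref_id}, $rec->{val});
        $dbh->do("RELEASE SAVEPOINT per_record");
    };
    if ($@) {
        # Undo only the failed statement; the enclosing
        # transaction stays alive and usable.
        $dbh->do("ROLLBACK TO SAVEPOINT per_record");
        # ...decide here: update existing row, skip the record,
        # or set $file_bad = 1...
    }
}
$file_bad ? $dbh->rollback() : $dbh->commit();
```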

Regards,

-- Raju

Rajesh> while (1) {
Rajesh>     eval { <DBD::Pg operations here> };
Rajesh>     if ($@) {
Rajesh>         $dbh->rollback();
Rajesh>     } else {
Rajesh>         $dbh->commit();
Rajesh>     }
Rajesh> }

Rajesh> In both cases you lose on performance: in the former case
Rajesh> due to the checking overhead, and in the latter due to
Rajesh> repeated committing.

Rajesh> regds mallah.

Rajesh> On Monday 07 Apr 2003 10:01 am, Raj Mathur wrote:
>> Hi,
>>
>> Am using DBD::Pg with PostgreSQL 7.2.3, Perl 5.8.0. I have a
>> set of updates to a table which has (a) references to another
>> table and (b) possibly duplicates.
>>
>> The data is in a text file, which is read record by record and
>> appended to the database table. A transaction spans the
>> complete reading of the text file.
>>
>> If the cross-reference field in the file doesn't exist in the
>> referenced table I want to ignore the record.
>>
>> If the record already exists in the table I want to perform
>> some updates to the existing data.
>>
>> The problem is that the first record in the text file that has
>> an invalid reference, or one that already exists, causes the
>> transaction to abort and all subsequent updates from the file
>> to fail. Is there any way to tell DBI/PostgreSQL that it
>> should continue the transaction until the program directs it to
>> commit/rollback?
>>
>> Tried the following so far:
>>
>> Set RaiseError to null. No effect.
>>
>> Currently manually checking for duplicates/missing referenced
>> records and taking appropriate action when found. Pretty
>> inelegant.
>>
>> Pseudocode:
>>
>> open text file
>> begin transaction
>> while read text record
>>   write into table
>>   if write failed due to duplicate
>>     read existing record
>>     update values in existing record
>>     rewrite record
>>   else if write failed due to missing reference
>>     ignore record
>>   else if write failed
>>     mark file as bad
>>
>> if file not bad
>>   commit
>> else
>>   rollback
>>
>> Hope this is the right list to be asking on.
>>
>> Regards,
>>
>> -- Raju

--
Raj Mathur raju(at)kandalaya(dot)org http://kandalaya.org/
It is the mind that moves
