Re: Advice on logging strategy

From: Rob Sargent <robjsargent(at)gmail(dot)com>
To: Mike Martin <redtux1(at)gmail(dot)com>
Cc: "pgsql-general(at)postgresql(dot)org" <pgsql-general(at)postgresql(dot)org>
Subject: Re: Advice on logging strategy
Date: 2018-10-11 10:33:41
Message-ID: 76CE3E73-7564-441F-9E2B-3F5873D4A153@gmail.com
Lists: pgsql-general

> On Oct 11, 2018, at 4:26 AM, Mike Martin <redtux1(at)gmail(dot)com> wrote:
>
> I have a question on logging strategy
>
> I have logging set to
> log_statement = 'all' on a network database, with logging set to csv so I can import it into a logging table
>
> However, the database is populated via a nightly routine that downloads data from a REST API using prepared statements
>
> This results in enormous log files which take ages to import using COPY, because each EXECUTE statement is logged with the parameters chosen
>
> Is there any way around this?
>
> I can't find any way to filter DML statements
>
> thanks
>
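Not sure whether it fits your setup, but log_statement can also be overridden per role (values are none/ddl/mod/all), so if the nightly job connects as its own role you could quiet just that role and keep 'all' for everything else. Rough, untested sketch -- the role name "etl_loader" is made up:

-- run as superuser; takes effect on the loader's next connection
ALTER ROLE etl_loader SET log_statement = 'ddl';  -- or 'none' to drop its statements entirely
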
Do you want all the log lines in your logging table?
There was a thread yesterday (10 Oct 2018) on COPY which mentioned the possibility of multiple processes COPYing to the same table.
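Something along these lines, assuming the log table already matches the csvlog column layout (the file paths and the table name "postgres_log" are just placeholders here, and server-side COPY FROM a file needs superuser or pg_read_server_files):

-- split the day's CSV log into chunks, then COPY each chunk from a separate session
COPY postgres_log FROM '/var/log/postgresql/chunk_01.csv' WITH (FORMAT csv);
COPY postgres_log FROM '/var/log/postgresql/chunk_02.csv' WITH (FORMAT csv);  -- run in a second session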
