| From: | Tom Lane <tgl(at)sss(dot)pgh(dot)pa(dot)us> |
|---|---|
| To: | Michael Paquier <michael(dot)paquier(at)gmail(dot)com> |
| Cc: | PostgreSQL mailing lists <pgsql-hackers(at)postgresql(dot)org> |
| Subject: | Re: Similar to csvlog but not really, json logs? |
| Date: | 2014-08-27 02:41:13 |
| Message-ID: | 17204.1409107273@sss.pgh.pa.us |
| Lists: | pgsql-hackers |
Michael Paquier <michael(dot)paquier(at)gmail(dot)com> writes:
> Now what about a json format logging with one json object per log entry?
> A single json entry would need more space than a csv one as we need to
> track the field names with their values. Also, there is always the
> argument that if an application needs json-format logs, it could use
> csvlog on the Postgres side and do the transformation itself. But wouldn't
> it be a win for applications or tools if such an option were available
> in-core?
I think the extra representational overhead is already a good reason to
say "no". There is not any production scenario I've ever heard of where
log output volume isn't a consideration.
regards, tom lane
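The representational overhead being discussed is easy to see with a quick size comparison. The sketch below (not PostgreSQL code; field names are illustrative, loosely modeled on csvlog columns) renders the same log record once as a CSV row and once as a JSON object:

```python
# Sketch: compare the size of one log record as a csvlog-style row
# versus a JSON object. CSV carries only the values (column names are
# implicit); JSON repeats every field name in every entry.
import csv
import io
import json

record = {
    "log_time": "2014-08-27 02:41:13.000 UTC",
    "user_name": "postgres",
    "database_name": "mydb",
    "error_severity": "LOG",
    "message": "checkpoint complete",
}

# CSV: values only.
buf = io.StringIO()
csv.writer(buf).writerow(record.values())
csv_len = len(buf.getvalue())

# JSON: keys repeated alongside values, plus quoting and braces.
json_len = len(json.dumps(record))

print(csv_len, json_len)  # the JSON rendering is noticeably larger
```

Multiplied across millions of log lines, that per-entry difference is exactly the volume concern raised above.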