Re: Streaming large data into postgres [WORM like applications]

From: "John D(dot) Burger" <john(at)mitre(dot)org>
To: Postgres General <pgsql-general(at)postgresql(dot)org>
Subject: Re: Streaming large data into postgres [WORM like applications]
Date: 2007-05-14 12:51:14
Message-ID: 799E153A-81AC-4C66-92AA-9A821A60D0BC@mitre.org
Lists: pgsql-general

Dhaval Shah wrote:

> 2. Most of the streamed rows are very similar. Think syslog rows,
> where in most cases only the timestamp changes. Of course, if the
> data can be compressed, that will save disk space.

If it really is usually just the timestamp that changes, one way to
"compress" such data might be to split your logical row into two
tables. The first table has all the original columns but the timestamp,
plus an ID. The second table has the timestamp and a foreign key into
the first table. Depending on how wide your original row is, and how
often it's only the timestamp that changes, this could result in
decent "compression".

Of course, now you need to maintain referential integrity between the
two tables.
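A rough sketch of what I mean, with invented table and column names
(you would substitute whatever set of columns actually identifies a
repeated row in your data):

-- One row per distinct "payload"; the UNIQUE constraint lets you
-- reuse an existing row when only the timestamp differs.
CREATE TABLE log_event (
    id        serial PRIMARY KEY,
    host      text NOT NULL,
    severity  text NOT NULL,
    message   text NOT NULL,
    UNIQUE (host, severity, message)
);

-- One row per occurrence, pointing back at the shared payload.
CREATE TABLE log_occurrence (
    event_id   integer NOT NULL REFERENCES log_event (id),
    logged_at  timestamptz NOT NULL
);

-- Inserting means looking up (or creating) the payload row, then
-- recording the timestamp.  This simple form is not safe under
-- concurrent writers; a retry loop or lock would be needed there.
INSERT INTO log_event (host, severity, message)
SELECT 'web1', 'info', 'session opened'
WHERE NOT EXISTS (
    SELECT 1 FROM log_event
    WHERE host = 'web1' AND severity = 'info'
      AND message = 'session opened');

INSERT INTO log_occurrence (event_id, logged_at)
SELECT id, now()
FROM log_event
WHERE host = 'web1' AND severity = 'info'
  AND message = 'session opened';

How much this buys you depends on how wide the payload row is and how
often it actually repeats; for rows that never recur, the extra ID
column and index are pure overhead.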

- John D. Burger
MITRE
