From: Scott Marlowe <scott(dot)marlowe(at)gmail(dot)com>
To: Jay Manni <JManni(at)fireeye(dot)com>
Cc: "pgsql-performance(at)postgresql(dot)org" <pgsql-performance(at)postgresql(dot)org>
Subject: Re: High Frequency Inserts to Postgres Database vs Writing to a File
Date: 2009-11-04 03:35:19
Message-ID: dcc563d10911031935q5edf8961ye7812c161ea6e762@mail.gmail.com
Lists: pgsql-performance
On Tue, Nov 3, 2009 at 8:12 PM, Jay Manni <JManni(at)fireeye(dot)com> wrote:
> Hi:
>
> I have an application wherein a process needs to read data from a stream and
> store the records for further analysis and reporting. The data in the stream
> is in the form of variable-length records with clearly defined fields – so
> it can be stored in a database or in a file. The only caveat is that the
> rate of records coming in on the stream could be several thousand records a second.
>
> The design choice I am faced with currently is whether to use a postgres
> database or a flat file for this purpose. My application already maintains a
A common approach is to spool the incoming records to flat files, then
bulk-load those files into the database at a later time, so that if the
db falls behind no data is lost.
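
For the archives, here's a minimal sketch of that spool-then-bulk-load
pattern in Python with psycopg2. The spool path, table name, and columns
(`records(field_a, field_b)`) are placeholders, not anything from Jay's
actual schema; the point is that COPY loads a flat file far faster than
per-row INSERTs:

```python
import psycopg2

# Hypothetical spool file written by the stream reader: one
# tab-separated record per line, matching the target table's columns.
SPOOL_FILE = "records.spool"

def flush_spool_to_db(conn, spool_path):
    """Bulk-load a spool file into Postgres with COPY, which handles
    thousands of rows per second far better than per-row INSERTs."""
    with open(spool_path) as f, conn.cursor() as cur:
        # Placeholder table and columns -- substitute the real schema.
        cur.copy_expert("COPY records (field_a, field_b) FROM STDIN", f)
    conn.commit()

if __name__ == "__main__":
    conn = psycopg2.connect("dbname=streamdb")  # placeholder DSN
    flush_spool_to_db(conn, SPOOL_FILE)
    conn.close()
```

Rotating the spool file before each load (the stream reader writes to a
fresh file while the loader drains the old one) keeps the two processes
from stepping on each other.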
| | From | Date | Subject |
|---|---|---|---|
| Next Message | David Saracini | 2009-11-04 03:42:08 | Re: High Frequency Inserts to Postgres Database vs Writing to a File |
| Previous Message | Jay Manni | 2009-11-04 03:12:29 | High Frequency Inserts to Postgres Database vs Writing to a File |