Re: High Frequency Inserts to Postgres Database vs Writing to a File

From: David Saracini <dsaracini(at)yahoo(dot)com>
To: Jay Manni <JManni(at)FireEye(dot)com>, "pgsql-performance(at)postgresql(dot)org" <pgsql-performance(at)postgresql(dot)org>
Subject: Re: High Frequency Inserts to Postgres Database vs Writing to a File
Date: 2009-11-04 03:42:08
Message-ID: 582494.35747.qm@web180304.mail.gq1.yahoo.com
Lists: pgsql-performance

"could be several 1000 records a second."

So, are there periods when there are no/few records coming in? Do the records/data/files really need to be persisted?

The following statement makes me think you should go the flat file route:

"The advantage of running complex queries to mine the data in various different ways is very appealing"
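If you do go the flat file route, buffered appends keep the per-record cost tiny. Here's a minimal sketch of what that loop can look like (the path, record format, and flush interval are made up for illustration):

```python
def append_records(path, records, flush_every=1000):
    """Append newline-delimited records to a flat file using a
    large userspace buffer, flushing periodically so a crash
    loses at most flush_every records."""
    with open(path, "a", buffering=1024 * 1024) as f:
        for i, rec in enumerate(records, 1):
            f.write(rec + "\n")
            if i % flush_every == 0:
                f.flush()  # hand the buffered data to the OS
```

At several thousand records a second this is almost entirely sequential I/O, which is about the cheapest thing a disk can do.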

Please don't be offended, but that sounds a little like feature creep. I've found that it's best to keep it simple and not do a bunch of work now for what might be requested in the future.

I know it's not exactly what you were looking for... Just food for thought.

Best of luck!

David

________________________________
From: Jay Manni <JManni(at)FireEye(dot)com>
To: "pgsql-performance(at)postgresql(dot)org" <pgsql-performance(at)postgresql(dot)org>
Sent: Tue, November 3, 2009 7:12:29 PM
Subject: [PERFORM] High Frequency Inserts to Postgres Database vs Writing to a File


Hi:

I have an application wherein a process needs to read data from a stream and store the records for further analysis and reporting. The data in the stream is in the form of variable length records with clearly defined fields – so it can be stored in a database or in a file. The only caveat is that the rate of records coming in the stream could be several 1000 records a second.

The design choice I am faced with currently is whether to use a postgres database or a flat file for this purpose. My application already maintains a postgres (8.3.4) database for other reasons – so it seemed like the straightforward thing to do. However I am concerned about the performance overhead of writing several 1000 records a second to the database. The same database is being used simultaneously for other activities as well and I do not want those to be adversely affected by this operation (especially the query times). The advantage of running complex queries to mine the data in various different ways is very appealing but the performance concerns are making me wonder if just using a flat file to store the data would be a better approach.
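One common way to keep per-row overhead down at those rates is to batch incoming records and load each batch with COPY rather than issuing individual INSERTs. A minimal sketch of building the COPY payload – the table and column names are hypothetical, and psycopg2 is assumed as the driver:

```python
import io

def make_copy_buffer(records):
    """Serialize records (tuples of strings) into a tab-separated
    in-memory buffer suitable for psycopg2's cursor.copy_from()."""
    buf = io.StringIO()
    for rec in records:
        buf.write("\t".join(rec) + "\n")
    buf.seek(0)  # rewind so copy_from reads from the start
    return buf

# In production (hypothetical table 'stream_records'):
# with conn.cursor() as cur:
#     cur.copy_from(make_copy_buffer(batch), "stream_records",
#                   columns=("ts", "field_a", "field_b"))
# conn.commit()
```

Committing once per batch instead of once per row also cuts down on WAL flushes, which is usually where row-at-a-time inserts spend their time.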

Anybody have any experience with high-frequency writes to a Postgres database?

- Jay
