best way to write large data-streams quickly?

From: Mark Moellering <markmoellering(at)psyberation(dot)com>
To: pgsql-general(at)postgresql(dot)org
Subject: best way to write large data-streams quickly?
Date: 2018-04-09 15:49:10
Message-ID: CAA0uU3XCiReRsK9-4Zsk0Mdhan1dG2Q4dj6iPQiEc-kOJumLyw@mail.gmail.com
Lists: pgsql-general

Everyone,

We are trying to architect a new system, which will have to take several
large data streams (a total of ~200,000 parsed files per second) and place
them in a database. I am trying to figure out the best way to import that
sort of data into Postgres.

I keep thinking I can't be the first to have this problem, and that there
are common solutions, but I can't find any. Does anyone know of some sort
of method, third-party program, etc., that can accept data from a number of
different sources and push it into Postgres as fast as possible?
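
For illustration, the simplest approach I can think of is batching parsed
records and loading them with COPY. A minimal sketch of what I mean (psycopg2;
the table name, columns, and connection string are just placeholders):

    # Rough sketch: batch parsed records in memory, then load them with COPY.
    # Table name, columns, and connection string are hypothetical.
    import io
    import psycopg2

    def load_batch(conn, records):
        """records: iterable of (source, payload) tuples from the parsers."""
        buf = io.StringIO()
        for source, payload in records:
            # COPY text format: tab-separated columns, one row per line.
            # (A real loader would escape tabs/newlines in the values.)
            buf.write(f"{source}\t{payload}\n")
        buf.seek(0)
        with conn.cursor() as cur:
            cur.copy_from(buf, "parsed_files", columns=("source", "payload"))
        conn.commit()

    conn = psycopg2.connect("dbname=streams")  # placeholder connection string
    load_batch(conn, [("feed-a", "example payload"), ("feed-b", "another payload")])

But I don't know if hand-rolling something like this is really the right
answer at this volume, or whether there is existing tooling for it.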

Thanks in advance,

Mark Moellering
