Bulk load billions of records into Postgres cluster

From: balasubramaniam <balasubramaniam(dot)b(at)gmail(dot)com>
To: pgsql-novice(at)postgresql(dot)org
Subject: Bulk load billions of records into Postgres cluster
Date: 2017-07-01 04:29:32
Message-ID: CACFhHyuehAgUm6cQ4RbELZC5HSnc9Zsi9hpQjo+g2q+kVW1i-Q@mail.gmail.com
Lists: pgsql-novice

Hi All,

We have a proven NoSQL production setup with a few billion rows. We are
planning to move towards a more structured data model with a few tables.

I am looking for a completely open-source and battle-tested database, and
Postgres seems like the right place to start.

Due to our increasing scale demands, I am planning to start with a PostgreSQL
cluster. The ability to ingest data at scale, around a few TBs, as quickly as
possible is critical for our use case. I have read through the official
documentation <https://www.postgresql.org/docs/current/static/populate.html>
and also about the COPY FROM command, but neither of these talks specifically
about a cluster setup.
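
To make sure I am reading the documentation correctly, this is the kind of
single-table COPY FROM statement I am referring to (the table, columns, and
file path below are just placeholders, not our real schema):

    -- Load one exported CSV file into a target table with a single COPY statement.
    -- Table, column, and path names are placeholders for illustration only.
    COPY events (id, user_id, payload, created_at)
        FROM '/data/exports/events_part_0001.csv'
        WITH (FORMAT csv, HEADER true);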

1) What is the standard and fastest way to ingest billions of records into
Postgres at scale?
2) Is there a tool that generates the SQL script for the COPY FROM command,
ready to use (a rough sketch of the kind of script I mean is below)? I want
to avoid writing and maintaining another custom tool.
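
The generated script I have in mind would simply be a series of COPY
statements, one per exported file, roughly like this (table and file names
are placeholders):

    -- One COPY statement per exported chunk; table and paths are placeholders.
    COPY events FROM '/data/exports/events_part_0001.csv' WITH (FORMAT csv, HEADER true);
    COPY events FROM '/data/exports/events_part_0002.csv' WITH (FORMAT csv, HEADER true);
    -- ... and so on, one statement per file.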

Thanks in advance,
bala
