From: | "Ron Mayer" <ron(at)intervideo(dot)com> |
---|---|
To: | <vendi(at)cosmoline(dot)com>, <pgsql-admin(at)postgresql(dot)org> |
Subject: | Re: Are 50 million rows a problem for postgres ? |
Date: | 2003-09-10 00:49:41 |
Message-ID: | POEDIPIPKGJJLDNIEMBEAEDFDJAA.ron@intervideo.com |
Lists: pgsql-admin
> Hi all, I work in a telco and I have a huge amount of data (50 million
> rows), but I see a lack of performance with huge tables in postgres.
> Are 50 million rows the "limit" of postgres (with good performance)?
I have worked on a data warehouse (PostgreSQL 7.3) with a
pretty standard star schema: over 250 million rows in the
central 'fact' table, and anywhere from 100 to 10+ million
records in each of the surrounding 'dimension' tables.
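For anyone unfamiliar with the layout, a minimal sketch of such a
star schema might look like the following. All table and column
names here are made up for illustration; they are not my actual
schema.

  -- Hypothetical dimension tables (anywhere from 100 to
  -- 10+ million rows each):
  CREATE TABLE dim_time     (id integer PRIMARY KEY, day date);
  CREATE TABLE dim_customer (id integer PRIMARY KEY, name text);
  CREATE TABLE dim_product  (id integer PRIMARY KEY, name text);

  -- Hypothetical central fact table; each row points at the
  -- dimension tables by id:
  CREATE TABLE fact (
      time_id     integer NOT NULL REFERENCES dim_time,
      customer_id integer NOT NULL REFERENCES dim_customer,
      product_id  integer NOT NULL REFERENCES dim_product,
      amount      numeric
  );

  -- Indexes on the fact table's ids keep the common joins fast:
  CREATE INDEX fact_time_idx     ON fact (time_id);
  CREATE INDEX fact_customer_idx ON fact (customer_id);
  CREATE INDEX fact_product_idx  ON fact (product_id);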
The most common queries were simple joins between 3 tables, with
selects on one of the ids. These took anywhere from a few seconds
to a minute (1-60 seconds).
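A typical query of that shape, using the made-up schema sketched
above, would be something like:

  -- Hypothetical example of the common 3-table join,
  -- restricted by one dimension id:
  SELECT t.day, p.name, sum(f.amount)
  FROM fact f
  JOIN dim_time t    ON t.id = f.time_id
  JOIN dim_product p ON p.id = f.product_id
  WHERE f.customer_id = 12345
  GROUP BY t.day, p.name;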
About 500,000 new records were loaded each night; the ETL
processing and creation of some aggregates took about 11 hours/night
with 7.3, and 9 hours/night with 7.4beta.
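The aggregates were along these lines (again, the names are
invented; only the idea of pre-summarizing the fact table into a
smaller rollup table for reporting is from my setup):

  -- Hypothetical nightly rollup: pre-aggregate the fact
  -- table into a smaller summary table for reporting:
  CREATE TABLE agg_product_daily AS
  SELECT time_id, product_id, sum(amount) AS total_amount
  FROM fact
  GROUP BY time_id, product_id;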
Hope this helps.