From: Sam Barnett-Cormack <s(dot)barnett-cormack(at)lancaster(dot)ac(dot)uk>
To: Vasilis Ventirozos <vendi(at)cosmoline(dot)com>
Cc: pgsql-admin(at)postgresql(dot)org
Subject: Re: Are 50 million rows a problem for postgres ?
Date: 2003-09-08 09:16:21
Message-ID: Pine.LNX.4.50.0309081014440.22692-100000@short.lancs.ac.uk
Lists: pgsql-admin
On Mon, 8 Sep 2003, Vasilis Ventirozos wrote:
> Hi all, I work in a telco and I have a huge amount of data (50 million
> rows), but I see a lack of performance on huge tables with postgres.
> Are 50 million rows the "limit" of postgres (with good performance)?
> I am expecting 2 billion records by 2004, so I have to do something.
> Does anyone here run a huge database I could ask about some issues?
>
> My hardware is good and my indexes are good, so please don't answer
> with something like "use vacuum" :)
I have a similarly huge number of records, as I process our web, ftp,
and rsync logs together using postgres. Works like a charm. You do have
to accept that queries are going to take a long time. I use about 6
queries to summarise a quarter's data - each is run once per month, so a
total of 18 queries. These run in a little over 24 hours. And there are
many, many records per month.
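For anyone curious what such a summary query might look like: the sketch
below is only an illustration of the general approach (a monthly aggregate
over a raw log table), not the actual queries I run - the table and column
names here are invented.

```sql
-- Hypothetical example: summarise one month of a raw access log
-- into per-day hit counts and bytes served. "access_log",
-- "request_time", and "bytes_sent" are made-up names for illustration.
SELECT date_trunc('day', request_time) AS day,
       count(*)                        AS hits,
       sum(bytes_sent)                 AS total_bytes
FROM   access_log
WHERE  request_time >= '2003-07-01'
  AND  request_time <  '2003-08-01'
GROUP  BY date_trunc('day', request_time)
ORDER  BY day;
```

On tables this size, an index on the timestamp column helps the range
filter, but the grouping itself is still a big sequential pass, which is
why runs of many hours are normal.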
--
Sam Barnett-Cormack
Software Developer | Student of Physics & Maths
UK Mirror Service (http://www.mirror.ac.uk) | Lancaster University