From: Lew <noone(at)lwsc(dot)ehost-services(dot)com>
To: pgsql-general(at)postgresql(dot)org
Subject: Re: is postgres a good solution for billion record data.. what about mySQL?
Date: 2009-10-25 02:00:06
Message-ID: hc0bf6$emh$1@news.albasani.net
Lists: pgsql-general
Raymond O'Donnell wrote:
> On 24/10/2009 20:46, Tawita Tererei wrote:
>> In addition to this, what about MySQL: how much data (how many records) can
>> it manage?
>>
>> regards
>>
>> On Sun, Oct 25, 2009 at 3:32 AM, shahrzad khorrami <
>> shahrzad(dot)khorrami(at)gmail(dot)com> wrote:
>>
>>> Is Postgres a good solution for billion-record data? Think of 300kb of data
>>> inserted into the db each minute; I'm coding with PHP.
>>> What do you recommend for managing these data?
>
> I know that many people on this list manage very large databases with
> PostgreSQL. I haven't done it myself, but I understand that with the
> right hardware and good tuning, PG will happily deal with large volumes
> of data; and 300kb a minute isn't really very much by any standard.
>
> You can get a few numbers here: http://www.postgresql.org/about/
I know folks who've successfully worked with multi-terabyte databases with PG.
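For scale, here is a quick back-of-the-envelope sketch in Python of what that insert rate adds up to, assuming the 300 kB/minute figure is sustained around the clock (decimal units, 1 MB = 1000 kB):

```python
# Hypothetical constants for a sustained 300 kB/minute insert rate.
kb_per_minute = 300
mb_per_day = kb_per_minute * 60 * 24 / 1000   # minutes/hour * hours/day
gb_per_year = mb_per_day * 365 / 1000

print(f"{mb_per_day:.0f} MB/day, ~{gb_per_year:.0f} GB/year")
# → 432 MB/day, ~158 GB/year
```

Well under 200 GB of raw data per year, which is comfortably within what a single tuned PG instance on commodity hardware handles.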
--
Lew