From: Raymond O'Donnell <rod(at)iol(dot)ie>
To: nevita0305(at)hotmail(dot)com
Cc: pgsql-general(at)postgresql(dot)org
Subject: Re: is postgres a good solution for billion record data.. what about mySQL?
Date: 2009-10-24 19:53:50
Message-ID: 4AE35B4E.3000601@iol.ie
Lists: pgsql-general
On 24/10/2009 20:46, Tawita Tererei wrote:
> In addition to this what about MySQL, how much data (records) that can be
> managed with it?
>
> regards
>
> On Sun, Oct 25, 2009 at 3:32 AM, shahrzad khorrami <
> shahrzad(dot)khorrami(at)gmail(dot)com> wrote:
>
>> is postgres a good solution for billion record data, think of 300kb data
>> insert into db at each minutes, I'm coding with php
>> what do you recommend to manage these data?

I know that many people on this list manage very large databases with
PostgreSQL. I haven't done it myself, but I understand that with the
right hardware and good tuning, PG will happily deal with large volumes
of data; and 300 kB a minute isn't really very much by any standard.
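To put that rate in perspective, a quick back-of-the-envelope calculation (a sketch, assuming "300kb" means 300 KiB; decimal kB gives a similar figure):

```python
# How much data is 300 KiB/minute over a day and a year?
kib_per_minute = 300
mib_per_day = kib_per_minute * 60 * 24 / 1024
gib_per_year = mib_per_day * 365 / 1024
print(round(mib_per_day, 1))   # 421.9 MiB/day
print(round(gib_per_year, 1))  # 150.4 GiB/year
```

Roughly 150 GiB a year, which a properly tuned PostgreSQL instance on ordinary server hardware can absorb comfortably.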
You can get a few numbers here: http://www.postgresql.org/about/
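For sustained insert loads like this, one common approach is to batch rows and load them with COPY rather than issuing one INSERT per row. A minimal sketch below builds the COPY payload in memory; the table name "readings", its columns, and the sample rows are made up for illustration, and with a driver such as psycopg2 you would hand the buffer to cursor.copy_expert():

```python
import csv
import io

# Accumulate a batch of rows (e.g. a minute's worth) instead of
# inserting them one at a time. Columns here are hypothetical.
rows = [
    (1, "2009-10-24 19:53:50", 42.0),
    (2, "2009-10-24 19:54:50", 43.5),
]

# Serialize the batch as CSV into an in-memory buffer.
buf = io.StringIO()
writer = csv.writer(buf)
writer.writerows(rows)
buf.seek(0)

# One COPY statement loads the whole batch in a single round trip.
copy_sql = "COPY readings (id, ts, value) FROM STDIN WITH (FORMAT csv)"
print(copy_sql)
print(buf.read(), end="")
```

The same pattern works from PHP (pg_copy_from, or pg_put_line with COPY ... FROM STDIN); the point is to amortize per-statement overhead across many rows.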
Ray.
--
Raymond O'Donnell :: Galway :: Ireland
rod(at)iol(dot)ie