From: Scott Marlowe <smarlowe(at)g2switchworks(dot)com>
To: Thuy Nguyen <ntcacthuyvn(at)yahoo(dot)com>
Cc: pgsql general <pgsql-general(at)postgresql(dot)org>
Subject: Re: Ask about large database?
Date: 2007-01-18 16:12:54
Message-ID: 1169136774.9586.72.camel@state.g2switchworks.com
Lists: pgsql-general
On Thu, 2007-01-18 at 02:03, Thuy Nguyen wrote:
> Hi Sir/Madam!
> I am looking for an RDBMS solution for my web application; the size of
> my database may be about 10GB-500GB. Will PostgreSQL work well, and how
> should I configure it?
> My application processes data locally; each database has only one
> connection (not a client-server model).
> Please send me your suggestions!
Not being flippant here, honest, but you haven't told us enough to
really know. I've seen terabyte-sized datasets that postgresql was very
fast on (relatively speaking) and gigabyte-sized datasets that it was a
horrible choice for.
Most of the time it's the processing methodology that determines if
pgsql is a good fit, not the size of the dataset.
Batch processing is generally not pgsql's strong suit. It can do it,
and I've used it for it many a time, but often sed / awk / mysql / grep
/ php / perl are better tools for batch processing than pgsql.
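To make the batch-processing point concrete, here's a hypothetical sketch (the file name and data are invented for illustration): aggregating a column straight from a flat file with awk. It streams the file once in constant memory, with no table load, no indexes, and no transaction overhead.

```shell
# Invented sample data: item,quantity pairs in a flat CSV file.
printf 'widget,3\ngadget,5\nwidget,2\n' > /tmp/sales.csv

# Sum quantities per item in a single pass -- the kind of job where
# awk often beats loading everything into a database first.
awk -F, '{sum[$1] += $2} END {for (k in sum) print k, sum[k]}' /tmp/sales.csv | sort
```

The same aggregate in pgsql would mean a COPY into a table, a GROUP BY query, and a cleanup step; for a one-shot batch job the pipeline above is usually faster and simpler.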
So, tell us what you're gonna do. Plus, it might be that with some
tweaks to your methods, pgsql can go from being a poor choice to being a
good one.
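If the dataset does land in that 10GB-500GB range and pgsql turns out to be a fit, a hedged sketch of the postgresql.conf knobs people usually look at first (the values are illustrative only, not recommendations -- the right numbers depend on your RAM and workload):

```
# Memory for shared data pages; often raised well above the default
# on a machine dedicated to the database.
shared_buffers = 256MB

# Per-sort/per-hash working memory; safe to raise when there is only
# one connection, as in the setup described above.
work_mem = 64MB

# Memory for maintenance tasks like VACUUM and CREATE INDEX.
maintenance_work_mem = 256MB

# Hint to the planner about how much of the dataset the OS can cache.
effective_cache_size = 1GB
```

None of this substitutes for knowing the access pattern, which is why the question above matters more than the config.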