How to handle a large DB and simultaneous accesses?

From: "Charles A(dot) Landemaine" <landemaine(at)gmail(dot)com>
To: pgsql-performance(at)postgresql(dot)org
Subject: How to handle a large DB and simultaneous accesses?
Date: 2006-01-10 14:41:27
Message-ID: e6575a30601100641s61317c63o@mail.gmail.com
Lists: pgsql-performance

Hello,

I have to develop a company search engine (similar to the Yellow
Pages). We're using PostgreSQL at the company, and the initial DB is
2GB, as it has companies from the entire world, with a fair amount
of information.

What reading do you suggest so that we can develop the search engine
core in such a way that result pages show up instantly, regardless of
the load and the DB size? The DB is 2GB now but should grow to 10GB
within 2 years, and we expect 250,000 unique visitors per month by
the end of the year.

Are there special techniques? Maybe there's a way to sort of cache
search results? We're using PHP5 + phpAccelerator.
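To make the caching question concrete, here is a minimal sketch of the kind of query-result cache I have in mind: an in-memory store keyed by the query string, with a time-to-live so stale results expire. All names here are hypothetical, and it's in Python only for illustration (the same idea applies in PHP5, or at the SQL/memcached layer).

```python
import time


class SearchCache:
    """Tiny in-memory TTL cache for search results, keyed by query string.
    Hypothetical sketch -- a real deployment might use memcached instead."""

    def __init__(self, ttl_seconds=300):
        self.ttl = ttl_seconds
        self.store = {}  # query -> (timestamp, results)

    def get(self, query):
        entry = self.store.get(query)
        if entry is None:
            return None
        ts, results = entry
        if time.time() - ts > self.ttl:
            del self.store[query]  # entry expired; drop it
            return None
        return results

    def put(self, query, results):
        self.store[query] = (time.time(), results)


def search(query, cache, run_query):
    """Return cached results when fresh, otherwise hit the database.
    `run_query` stands in for the actual SELECT against PostgreSQL."""
    hit = cache.get(query)
    if hit is not None:
        return hit
    results = run_query(query)
    cache.put(query, results)
    return results
```

The point is that popular queries (the common case for a Yellow-Pages-style site) never reach the database within the TTL window, which is what keeps result pages fast under heavy load.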
Thanks,

--
Charles A. Landemaine.
