From: Yeb Havinga <yebhavinga(at)gmail(dot)com>
To: Geoffrey <lists(at)serioustechnology(dot)com>
Cc: pgsql-general(at)postgresql(dot)org
Subject: Re: database connections and presenting data on the web
Date: 2010-03-18 13:25:58
Message-ID: 4BA229E6.6000400@gmail.com
Lists: pgsql-general
Geoffrey wrote:
> We are trying to determine the best solution for a web based
> application. We have 13 databases (separate postmaster for each
> database) that we need to retrieve data from in order to produce the
> web page. This data is changing on a regular basis. Question is:
>
> 1. Do we:
>
> for database in 1-13;do
> connect database
> retrieve data
> disconnect database
> done
> display data
>
> 2. Or, do we have a common table for all databases that a daemon keeps
> updated and simply pull the data from that table?
>
> The data that's being retrieved is changing literally by the minute.
>
> The cgi code is perl.
3. Like 1, but with a connection pooler such as pgpool in front of the
databases. I'm not sure whether pgpool supports asynchronous queries, but
those would help as well, by pulling data from the 13 databases in parallel
instead of serially: send the queries to the 13 servers without waiting for
results, then, as soon as the data is ready, fetch the results and display
them.
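
Since the CGI code is Perl, the parallel fan-out could also be done client
side with DBD::Pg's asynchronous query support (pg_async / PG_ASYNC),
independent of the pooler. A rough sketch, where the DSNs, credentials and
the query are placeholders and error handling is omitted:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use DBI;
    use DBD::Pg qw(:async);    # exports PG_ASYNC

    # Hypothetical DSNs, one per postmaster.
    my @dsns = map { "dbi:Pg:dbname=app;host=db$_" } 1 .. 13;

    my @pending;
    for my $dsn (@dsns) {
        my $dbh = DBI->connect($dsn, 'webuser', 'secret',
                               { RaiseError => 1, AutoCommit => 1 });
        # Send the query without waiting for the result.
        my $sth = $dbh->prepare('SELECT * FROM status_view',
                                { pg_async => PG_ASYNC });
        $sth->execute;
        push @pending, { dbh => $dbh, sth => $sth };
    }

    # Collect results as each server finishes.
    my @rows;
    while (@pending) {
        for my $p (grep { $_->{dbh}->pg_ready } @pending) {
            $p->{dbh}->pg_result;   # finish the asynchronous call
            push @rows, @{ $p->{sth}->fetchall_arrayref({}) };
            $p->{done} = 1;
        }
        @pending = grep { !$_->{done} } @pending;
        select(undef, undef, undef, 0.01) if @pending;  # avoid a busy loop
    }

    # @rows now holds the data from all 13 databases, ready to render.

The total wait is then roughly the slowest of the 13 queries rather than
their sum, and a pooler still helps by removing the per-request connect
cost.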
regards,
Yeb Havinga