Re: The Last Optimization

From: Christopher Kings-Lynne <chriskl(at)familyhealth(dot)com(dot)au>
To: Areski Belaid <areski5(at)hotmail(dot)com>
Cc: pgsql-php(at)postgresql(dot)org
Subject: Re: The Last Optimization
Date: 2002-09-07 05:27:02
Message-ID: 20020907132546.M42335-100000@houston.familyhealth.com.au
Lists: pgsql-php

You'll have to post your complete schema and the actual queries you are
running that are so slow. Postgres can easily handle tables this large, so
there are probably more speed improvements you can make.

Chris
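
[Archive note: a minimal sketch of the diagnostic step suggested above, using EXPLAIN ANALYZE, which was available in the PostgreSQL 7.2 series current at the time. The table name `mean` and column `user_id` are placeholders, not taken from the thread.]

```sql
-- Show the actual plan and per-node timings for a slow query.
-- Table and column names here are hypothetical.
EXPLAIN ANALYZE
SELECT *
FROM mean
WHERE user_id = 42;

-- A "Seq Scan on mean" in the output over millions of rows usually means
-- a useful index is missing or not being used. After creating one, e.g.
--   CREATE INDEX mean_user_id_idx ON mean (user_id);
-- and running ANALYZE, the plan should switch to an Index Scan.
```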

On Fri, 6 Sep 2002, Areski Belaid wrote:

> I have a huge table with 14 fields and a few million rows...
> My PHP/Pg application is becoming impossible to use.
>
> Redhat 7.3
> Dual PIII 900Mhz System
> 2GB RAM
>
> I have already done some optimization:
>
> max_connections = 64
> shared_buffers = 32000
> sort_mem = 64336
> fsync = false
> ---
> echo 128000000 > /proc/sys/kernel/shmmax
>
> I also ran VACUUM and ANALYZE, and added indexes.
>
> ---
>
> This optimization was enough at the beginning, but NOT now with several
> million rows.
>
> So WHAT CAN I DO ??? USE ORACLE ???
>
> I am thinking of maybe splitting my main table into different tables
> Mean_a, Mean_b ... Mean_z ???
> If that is the way to go, where can I find docs or help on how to split a
> table ???
>
> I'm lost !!! ;)
>
>
>
> Areski
>
> ---------------------------(end of broadcast)---------------------------
> TIP 3: if posting/reading through Usenet, please send an appropriate
> subscribe-nomail command to majordomo(at)postgresql(dot)org so that your
> message can get through to the mailing list cleanly
>
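
[Archive note: on the split-table question above, a minimal sketch using table inheritance, which was the mechanism for partitioning a table in the PostgreSQL 7.x releases. Table and column names are assumptions, not from the thread.]

```sql
-- Hypothetical parent table; child tables inherit all of its columns.
CREATE TABLE mean (
    id      integer,
    name    text
    -- ... the other fields ...
);

-- One child table per leading letter of "name".
CREATE TABLE mean_a (CHECK (name >= 'a' AND name < 'b')) INHERITS (mean);
CREATE TABLE mean_b (CHECK (name >= 'b' AND name < 'c')) INHERITS (mean);
-- ... and so on ...

-- A query against the parent scans all children:
SELECT * FROM mean WHERE name = 'areski';

-- But the 7.x planner does not skip children based on their CHECK
-- constraints (constraint exclusion arrived in a later release), so to
-- benefit from the split you must target the right child directly:
SELECT * FROM mean_a WHERE name = 'areski';
```

That said, as the reply above suggests, an appropriate index on the filtered column usually removes the need to split the table at all.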
