From: | "Areski Belaid" <areski5(at)hotmail(dot)com> |
---|---|
To: | "pgsql-general" <pgsql-general(at)postgresql(dot)org> |
Subject: | The Last Optimization |
Date: | 2002-09-06 10:50:42 |
Message-ID: | OE50CTmlhtqx8IPXyje000096f0@hotmail.com |
Lists: | pgsql-general |
I have a huge table with 14 fields and a few million rows of data...
My PHP/PostgreSQL application is becoming impossible to use.
Redhat 7.3
Dual PIII 900Mhz System
2GB RAM
I already did some optimization:
max_connections = 64
shared_buffers = 32000
sort_mem = 64336
fsync = false
---
echo 128000000 > /proc/sys/kernel/shmmax
I also run VACUUM and ANALYZE, and have created indexes.
---
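For a table this size, the usual next step is to check whether the slow queries actually use an index. A minimal sketch, assuming a table named `mean` with a commonly-filtered column `id` (both names are placeholders, not from the original post):

```sql
-- Rebuild planner statistics after bulk changes
VACUUM ANALYZE mean;

-- Index the column your WHERE clauses filter on
-- (column name "id" is hypothetical)
CREATE INDEX mean_id_idx ON mean (id);

-- Verify the planner uses it: look for "Index Scan"
-- in the output rather than "Seq Scan"
EXPLAIN SELECT * FROM mean WHERE id = 42;
```

If EXPLAIN still shows a sequential scan on an indexed column, the query may be written in a way the planner cannot match to the index (e.g. a type mismatch or a function applied to the column).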
This optimization was enough at the beginning, but NOT now that there are
several million rows.
So WHAT CAN I DO??? USE ORACLE???
I am thinking of maybe splitting my main table into different tables Mean_a,
Mean_b, ... Mean_z???
IF that is the way to go, where can I find documentation or help on how to
split a table???
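If you do go the splitting route, PostgreSQL's table inheritance (available in the 7.x series) lets the pieces still be queried as one table. A rough sketch, with table and column names invented purely for illustration:

```sql
-- Parent table holds the common schema; children hold the rows
CREATE TABLE mean (name text, value integer);
CREATE TABLE mean_a () INHERITS (mean);
CREATE TABLE mean_b () INHERITS (mean);

-- The application inserts into the appropriate child table...
INSERT INTO mean_a VALUES ('alpha', 1);
INSERT INTO mean_b VALUES ('beta', 2);

-- ...but a SELECT on the parent still sees all children's rows
SELECT * FROM mean;
```

Note that in this era of PostgreSQL the application must route inserts to the right child itself, and the planner still scans every child unless each child has its own indexes, so splitting alone is not guaranteed to help.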
I'm lost!!! ;)
Areski