From: Dino Vliet <dino_vliet(at)yahoo(dot)com>
To: pgsql-general(at)postgresql(dot)org
Subject: can postgresql handle these large tables
Date: 2004-08-18 18:24:04
Message-ID: 20040818182404.66169.qmail@web40102.mail.yahoo.com
Lists: pgsql-general
Hi folks,
I'm busy analyzing some data and will therefore have
to store two big tables containing 50 million and
25 million observations. Selecting the interesting
observations can shrink these tables by maybe a
factor of 10, but eventually I will have to combine
them using a join. These tables contain almost 20
columns, and after selecting the appropriate ones I
will have maybe 10 columns.
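Roughly, the setup I have in mind looks like this
(table names, column names, and filter conditions are
just placeholders for illustration, not my real schema):

-- Two large source tables (schemas made up for illustration).
CREATE TABLE big_a (
    id        integer,
    join_key  integer,
    category  text,
    value     numeric
    -- ... ~20 columns in total
);

CREATE TABLE big_b (
    id        integer,
    join_key  integer,
    flag      boolean,
    amount    numeric
    -- ... ~20 columns in total
);

-- Keep only the interesting observations (roughly a factor of 10
-- fewer rows) and only the ~10 columns I actually need.
CREATE TABLE sel_a AS
    SELECT id, join_key, category, value
    FROM big_a
    WHERE category = 'interesting';

CREATE TABLE sel_b AS
    SELECT id, join_key, flag, amount
    FROM big_b
    WHERE flag;

-- Index the join key on both sides before the big join.
CREATE INDEX sel_a_join_key_idx ON sel_a (join_key);
CREATE INDEX sel_b_join_key_idx ON sel_b (join_key);

-- The join I will eventually have to run.
SELECT a.id, a.category, a.value, b.amount
FROM sel_a a
JOIN sel_b b ON a.join_key = b.join_key;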
Of course I will have to make use of indexes and try
to optimize the settings in the postgresql.conf file, like:
geqo = true
geqo_threshold = 11
geqo_effort = 1
geqo_generations = 5
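From psql I would then check which settings are in effect
and look at the plan of the join (using the placeholder
tables from the sketch above) to see whether the indexes
actually get used, something along these lines:

-- Check the planner settings currently in effect.
SHOW geqo;
SHOW geqo_threshold;
SHOW geqo_effort;
SHOW geqo_generations;

-- Look at the plan (and actual run time) of the big join.
EXPLAIN ANALYZE
SELECT a.id, a.category, a.value, b.amount
FROM sel_a a
JOIN sel_b b ON a.join_key = b.join_key;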
But my main question is whether PostgreSQL will be
able to handle these large volumes of data, or rather
whether my hardware will be capable of working with
such large tables.
Answers, ideas, remarks... everything is welcome, folks.
Many thanks in advance; what is done on this list is
MUCH appreciated!!