From: Alexander Antonakakis <alexis(at)maich(dot)gr>
To: pgsql-hackers(at)postgresql(dot)org
Subject: Big Database
Date: 2004-11-11 06:20:11
Message-ID: 1100154011.391491@athnrd02
Lists: pgsql-hackers
I would like to ask the more experienced Postgres users a couple of
questions about a database I manage with a lot of data. A lot of data
means something like 15,000,000 rows in a table. I will try to describe
the tables and what I will have to do with them :)
There is a table that holds the product data:

Table product:
    product_id   varchar(8),
    product_name text

and a product actions table:

Table product_actions:
    product_id varchar(8),
    flow       char(1),
    who        int,
    where      int,
    value      float
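
Written out as DDL the tables look more or less like this (note that
where is a reserved word in SQL, so as a column name it has to be
double-quoted):

CREATE TABLE product (
    product_id   varchar(8),
    product_name text
);

CREATE TABLE product_actions (
    product_id varchar(8),
    flow       char(1),
    who        int,
    "where"    int,    -- "where" is reserved, so it must be quoted
    value      float
);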
I will have to run SQL queries of the form "select value from
product_actions where who='someone' and where='somewhere'", and maybe
also do some calculations on the results. I have already made some
indexes on these tables and a view that joins the two of them, but I
would like to ask you people if anyone is running such a big db and how
I can speed things up as much as possible. These product_actions tables
exist for each year from 1988 till 2003, so this means a lot of data...
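
To make the question concrete, here is roughly what I have set up so
far; the index and view names and the literal values are just
illustrative:

-- multicolumn index matching the WHERE clause of the query above
CREATE INDEX product_actions_who_where_idx
    ON product_actions (who, "where");

-- the view that joins the two tables
CREATE VIEW product_values AS
    SELECT p.product_name, a.who, a."where", a.value
      FROM product p
      JOIN product_actions a ON a.product_id = p.product_id;

-- a typical lookup
SELECT value
  FROM product_actions
 WHERE who = 42
   AND "where" = 7;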
Thanks in Advance