From: Eduardo Vázquez Rodríguez <evazquez(at)insys-corp(dot)com(dot)mx>
To: "'psql-novice(at)postgresql(dot)org'" <psql-novice(at)postgresql(dot)org>
Subject: Query optimization
Date: 2005-02-19 02:21:54
Message-ID: 59B41C14544D314889E6FA384A307A9293EB26@osiris.insys-corp.com.mx
Lists: pgsql-novice
Hello Postgres users
I have a question about optimizing a query. I have a table that contains
approximately 14 million rows, and I want to retrieve the following
information:
SELECT username, sum(bytes_used) AS TOTAL
FROM BIGTABLE
GROUP BY username
LIMIT 100;
username is an indexed varchar column.
bytes_used is of type float8.
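To make that concrete, the relevant part of the table looks roughly like this
(a simplified sketch: the real table has more columns, and the index name here
is only illustrative):

CREATE TABLE bigtable (
    username    varchar,   -- indexed
    bytes_used  float8
    -- other columns omitted
);
CREATE INDEX bigtable_username_idx ON bigtable (username);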
But it is really slow: it has been running for an hour and simply does not
finish.
While the query is running, the table keeps being updated with more rows.
How can I improve it?
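I suppose the first step is to look at the query plan; this is what I would
run (plain EXPLAIN only shows the plan, whereas EXPLAIN ANALYZE would execute
the whole query, so it could take as long as the query itself):

EXPLAIN
SELECT username, sum(bytes_used) AS TOTAL
FROM BIGTABLE
GROUP BY username
LIMIT 100;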
Thanks in advance