From: tv(at)fuzzy(dot)cz
To: "sathiya psql" <sathiya(dot)psql(at)gmail(dot)com>
Cc: pgsql-general(at)postgresql(dot)org
Subject: Re: indexing - creates problem
Date: 2008-03-05 16:15:26
Message-ID: 25668.89.103.151.229.1204733726.squirrel@sq.gransy.com
Lists: pgsql-general
> I have a table with more than 1000 records and no index on it; while
> executing the query it occupies the processor..
1000 rows is not much - I guess the index is not necessary at all, as a
plain sequential scan is usually faster than an index scan on such a small
table (random access vs. sequential access).
But you have not provided enough information, so we can't give you a
precise answer. You should answer at least these questions:
0) What version of PostgreSQL are you running, and on what OS? What
machine is it running on?
1) What is the structure of the table? What columns does it have, etc.
Post the CREATE script, or a similar description.
2) What query are you executing? Post the query as well as an explain
plan for it (put EXPLAIN before the SELECT; see the example below).
3) Have you analyzed the table before executing the query? Have you
vacuumed the table recently?
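
For illustration, a minimal sketch of what steps 2 and 3 might look like;
the table name "mytable" and the WHERE clause are just placeholders for
your actual schema and query:

  -- refresh planner statistics and reclaim dead rows (placeholder table name)
  VACUUM ANALYZE mytable;

  -- show the plan the optimizer chooses for your query
  EXPLAIN SELECT * FROM mytable WHERE some_column = 42;

  -- or actually run the query and report real row counts and timings
  EXPLAIN ANALYZE SELECT * FROM mytable WHERE some_column = 42;

Post the output of the EXPLAIN (or EXPLAIN ANALYZE) here.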
Tomas