From: Ryan Ho <ryanho(at)pacific(dot)net(dot)sg>
To: pgsql-novice(at)postgresql(dot)org
Subject: Optimizing complex queries
Date: 2001-06-13 05:18:54
Message-ID: 5.0.2.1.0.20010613131235.00b13610@pacific.net.sg
Lists: pgsql-novice
Hi all,
I've got to set up a program that automatically runs searches for up to
10000 members (batched once a week at night) and then emails the results of
those searches to them. The search query is rather complex, requiring up to
5 subqueries, close to 10 tables, many joins, and some full-text searches.
I've done the indexing, have changed the run-time variables of the
postmaster (i.e. sort_mem and the number of buffers), and vacuum the
database regularly. Yet it's still taking a couple of hours on a PIII 850
server with 512MB RAM.
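For reference, I've been checking the query plan with EXPLAIN to see whether the planner actually uses the indexes; a simplified sketch of the kind of check I run (the table and column names below are placeholders, not my real schema):

```sql
-- Show the plan for one piece of the search: an index scan here is
-- good, a sequential scan over a large table suggests the index
-- isn't being used. (Names are placeholders only.)
EXPLAIN
SELECT m.member_id, a.title
  FROM members m
  JOIN articles a ON a.member_id = m.member_id
 WHERE a.posted_date > '2001-06-01';
```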
I've read that writing the search as a stored procedure improves
performance. If so, how significant is the gain, and what affects it?
Also, I think full-text indexing would help, but it isn't supported
natively in Postgres, right?
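For context, this is roughly the kind of function I had in mind: a PL/pgSQL sketch that wraps one subquery of the search, so the logic runs server-side per member instead of being shipped over from the client each time. All names here are placeholders, not my actual query:

```sql
-- Rough sketch: count matching articles for one member.
-- $1 is the member id; all table/column names are placeholders.
CREATE FUNCTION match_count(integer) RETURNS integer AS '
DECLARE
    n integer;
BEGIN
    SELECT count(*) INTO n
      FROM articles a
     WHERE a.category_id IN (SELECT category_id
                               FROM member_interests
                              WHERE member_id = $1);
    RETURN n;
END;
' LANGUAGE 'plpgsql';

-- Then the weekly batch would call, per member:
-- SELECT match_count(42);
```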
Thanks in advance
Ryan