From: Heikki Linnakangas <heikki(at)enterprisedb(dot)com>
To: Henrik Zagerholm <henke(at)mac(dot)se>
Cc: Tom Lane <tgl(at)sss(dot)pgh(dot)pa(dot)us>, pgsql-performance(at)postgresql(dot)org
Subject: Re: Extreme slow select query 8.2.4
Date: 2007-08-06 19:47:22
Message-ID: 46B77ACA.9010302@enterprisedb.com
Lists: pgsql-performance
Henrik Zagerholm wrote:
> I know the query retrieves way more than is really necessary to show to
> the user, so I would gladly come up with a way to limit the query so the
> GUI doesn't hang for several minutes if a user does a bad search.
> The problem is that I don't know a good way of limiting the search
> efficiently, as only going on tbl_file with limit 100 could make the
> query return only 10 rows if the user doesn't have access to 900 of
> the files (this is what the join with tbl_acl does). Using cursors
> doesn't help because I really don't retrieve that much data.
Could you just add a LIMIT 100 to the end of the query, if 100 rows is
enough? That would cut the runtime of the query, if there's a quicker
plan to retrieve just those 100 rows.
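As a minimal sketch of what I mean (the column names and predicates here are illustrative, not taken from your actual query -- only tbl_file and tbl_acl come from your description):

```sql
-- Hypothetical sketch: apply the LIMIT after the ACL join, so the user
-- always gets up to 100 *visible* rows rather than 100 candidate rows
-- that might then be filtered down by the ACL check.
SELECT f.file_id, f.file_name
FROM tbl_file f
JOIN tbl_acl a ON a.file_id = f.file_id
WHERE a.user_id = 42                -- illustrative access predicate
  AND f.file_name LIKE 'report%'    -- illustrative search condition
ORDER BY f.file_name
LIMIT 100;
```

With the LIMIT in place, the planner may choose a plan that stops as soon as 100 qualifying rows have been produced, instead of materializing the whole result set.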
Another alternative is to use statement_timeout. If a query takes longer
than the specified timeout, it's automatically aborted and an error is
returned.
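For example (the 60-second value is just an example; pick whatever your GUI can tolerate):

```sql
-- Abort any statement in this session that runs longer than 60 seconds.
-- The value is in milliseconds.
SET statement_timeout = 60000;
```

This can be set per session as above, per user with ALTER ROLE, or globally in postgresql.conf. A query that exceeds the timeout fails with "canceling statement due to statement timeout", which the GUI would need to catch and report to the user.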
--
Heikki Linnakangas
EnterpriseDB http://www.enterprisedb.com