From: | "Merlin Moncure" <mmoncure(at)gmail(dot)com> |
---|---|
To: | "Dan Harris" <fbsd(at)drivefaster(dot)net> |
Cc: | "PostgreSQL Performance" <pgsql-performance(at)postgresql(dot)org> |
Subject: | Re: Help optimizing a slow index scan |
Date: | 2006-03-17 16:56:03 |
Message-ID: | b42b73150603170856j588b268bjd89ce769f4ac7a28@mail.gmail.com |
Lists: pgsql-performance
On 3/16/06, Dan Harris <fbsd(at)drivefaster(dot)net> wrote:
> explain analyze
> select distinct eventmain.incidentid, eventmain.entrydate,
> eventgeo.long, eventgeo.lat, eventgeo.geox, eventgeo.geoy
> from eventmain, eventgeo
> where
> eventmain.incidentid = eventgeo.incidentid and
> ( long > -104.998027962962 and long < -104.985957781349 ) and
> ( lat > 39.7075542720006 and lat < 39.7186195832938 ) and
> eventmain.entrydate > '2006-1-1 00:00' and
> eventmain.entrydate <= '2006-3-17 00:00'
> order by
> eventmain.entrydate;
As others will probably mention, efficient queries on lat/long, which is a
spatial problem, will require an r-tree or GiST index. I don't have a lot
of experience with exotic indexes, but this may be the way to go.
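As a rough, untested sketch using only the stock geometric types (no
PostGIS) -- the index name and bounding-box values here are just
illustrative:

    -- functional GiST index over a degenerate box built from long/lat
    create index eventgeo_latlong_gist on eventgeo
        using gist (box(point(long, lat), point(long, lat)));

    -- filter with the box overlap operator so the index can be used
    select incidentid, long, lat
    from eventgeo
    where box(point(long, lat), point(long, lat)) &&
          box(point(-104.998027962962, 39.7075542720006),
              point(-104.985957781349, 39.7186195832938));

The where clause has to repeat the indexed expression exactly for the
planner to consider the index.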
One easy optimization to consider is an index on either
(incidentid, entrydate) or (incidentid, long), whichever is more
selective for your data.
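For example (table and column names taken from your query; keep whichever
one actually helps):

    create index eventmain_incident_entrydate
        on eventmain (incidentid, entrydate);
    -- or, if long is the more selective filter:
    create index eventgeo_incident_long
        on eventgeo (incidentid, long);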
This is 'yet another query' that would be fun to try out and tweak
using the row-wise comparison support coming in 8.2.
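Something along these lines (untested; the incidentid value 'ABC123' is
made up, and this assumes a composite index on (entrydate, incidentid)):

    -- resume a scan just past the last row already seen, comparing the
    -- two columns as a single row value
    select incidentid, entrydate
    from eventmain
    where (entrydate, incidentid) > ('2006-03-01 00:00', 'ABC123')
    order by entrydate, incidentid
    limit 50;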
merlin