From: Alex <alex(at)liivid(dot)com>
To: pgsql-general(at)postgresql(dot)org
Subject: Re: Full text index not being used
Date: 2009-02-01 16:32:58
Message-ID: a7401703-36d3-4ba9-adc7-0ee17e1a0ff6@a39g2000prl.googlegroups.com
Lists: pgsql-general
So this seems to be because the result set is too large. I still don't
know why it is looping through every record and printing a warning,
but adding a LIMIT makes the queries complete in a reasonable time
(although still not all that fast).
However, I need to sort, and there are also many other facets that may
or may not be included in any given query. Adding a sort makes it load
every record again and take forever.
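To give a rough idea of the query shape (the real queries are built
dynamically from whichever facets are present, so the search terms and
filter values below are just placeholders):

    SELECT *
      FROM source_listings
     WHERE to_tsvector('english', full_listing) @@ to_tsquery('english', 'something')
       AND city = 'somecity'
       AND price BETWEEN 500 AND 1500
     ORDER BY post_time DESC
     LIMIT 100;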
I tried to create an index including all of the fields I query on to
see if that would work, but I get an error that the index row is too
large:
=> create index master_index on source_listings(geo_lat, geo_lon,
price, bedrooms, region, city, listing_type, to_tsvector('english',
full_listing), post_time);
NOTICE: word is too long to be indexed
DETAIL: Words longer than 2047 characters are ignored.
NOTICE: word is too long to be indexed
DETAIL: Words longer than 2047 characters are ignored.
NOTICE: word is too long to be indexed
DETAIL: Words longer than 2047 characters are ignored.
NOTICE: word is too long to be indexed
DETAIL: Words longer than 2047 characters are ignored.
ERROR: index row requires 13356 bytes, maximum size is 8191
Any ideas about how to resolve this?
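One direction I was wondering about is splitting the full-text part
into its own GIN index and leaving the other columns in an ordinary
btree index, something like the following (index names are just
placeholders, and I haven't tried this yet):

    create index source_listings_fts_idx on source_listings
        using gin (to_tsvector('english', full_listing));

    create index source_listings_facets_idx on source_listings
        (geo_lat, geo_lon, price, bedrooms, region, city, listing_type, post_time);

Would that be a reasonable way around the row-size limit?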