From: Michel Pelletier <pelletier(dot)michel(at)gmail(dot)com>
To: Jess Wren <jess(dot)wren(at)interference(dot)cc>
Cc: Arthur Zakirov <a(dot)zakirov(at)postgrespro(dot)ru>, pgsql-general <pgsql-general(at)lists(dot)postgresql(dot)org>
Subject: Re: How to use full-text search URL parser to filter query results by domain name?
Date: 2019-04-10 18:55:50
Message-ID: CACxu=vKr-1TNxmQrGx0uw=baJior2-nr7-X-bNN8VTDyLMGtfg@mail.gmail.com
Lists: pgsql-general
On Wed, Apr 10, 2019 at 1:58 AM Jess Wren <jess(dot)wren(at)interference(dot)cc> wrote:
> -> Parallel Seq Scan on links
> (cost=0.00..4554.40 rows=75740 width=112)
>
> -> Function Scan on ts_parse (cost=0.00..12.50 rows=5 width=32)
> Filter: (tokid = 6)
> (23 rows)
>
>
>
> I am wondering if there is a more efficient way to do things? Some people
> on IRC mentioned that it might be better to declare a scalar function to
> return the host from ts_parse instead of the LATERAL query ... but I
> couldn't figure out how to do that, or if it was even preferable to the
> above from a performance standpoint ... any ideas on how I could improve
> the above.
>
You might try indexing the parsed expression to avoid the seq scan on links,
something like:

    create index on links (ts_parse('default', target));

and then run EXPLAIN (or EXPLAIN ANALYZE) to see whether that improves
things. As the links table gets bigger, this should certainly help.
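One wrinkle: ts_parse is a set-returning function, so PostgreSQL will reject it if used directly in an index expression. The scalar-function approach mentioned in the quoted question can work around this. A minimal sketch, assuming the host token id is 6 (as in the quoted plan) and using an illustrative function name url_host (not part of any existing schema):

```sql
-- Hypothetical wrapper: returns the first host token (tokid = 6)
-- from the default parser's tokenization of a URL.
-- Marked IMMUTABLE so it can be used in an index expression; this is
-- safe as long as the 'default' parser configuration does not change.
CREATE OR REPLACE FUNCTION url_host(text) RETURNS text AS $$
  SELECT token FROM ts_parse('default', $1) WHERE tokid = 6 LIMIT 1;
$$ LANGUAGE sql IMMUTABLE STRICT;

-- Expression index on the extracted host:
CREATE INDEX links_url_host_idx ON links (url_host(target));
```

A query that filters on the same expression, e.g. `SELECT * FROM links WHERE url_host(target) = 'example.com';`, should then be able to use the index instead of a seq scan plus LATERAL ts_parse, which EXPLAIN ANALYZE would confirm.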