| From: | Paul Jungwirth <pj(at)illuminatedcomputing(dot)com> |
|---|---|
| To: | pgsql-general(at)postgresql(dot)org |
| Subject: | Re: Success story full text search |
| Date: | 2015-05-02 15:55:22 |
| Message-ID: | 5544F36A.5040707@illuminatedcomputing.com |
| Lists: | pgsql-general |
> Does someone have a success story of using Postgres Full Text Search
> with significant data, let's say > 50-100 GB?
This is a recent and very complete article on using Postgres for
full-text search:
http://blog.lostpropertyhq.com/postgres-full-text-search-is-good-enough/
See also the discussion threads here:
https://news.ycombinator.com/item?id=8381748
https://news.ycombinator.com/item?id=8714477
That should give you a good sense of the abilities and limitations
versus using Lucene, etc.
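For context, the machinery those articles cover is Postgres's built-in tsvector/tsquery support. A minimal sketch of the usual setup (the table and column names here are hypothetical, just for illustration):

```sql
-- Hypothetical documents table
CREATE TABLE docs (id serial PRIMARY KEY, body text);

-- An expression GIN index keeps searches fast even at tens of GB
CREATE INDEX docs_body_fts
  ON docs USING GIN (to_tsvector('english', body));

-- The @@ operator matches a tsquery against the indexed tsvector
SELECT id
FROM docs
WHERE to_tsvector('english', body)
      @@ to_tsquery('english', 'postgres & search');
```

The WHERE clause must repeat the same `to_tsvector('english', body)` expression for the planner to use the index.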
On scanning that article I don't see any mention of size, but you could
always ask the author!
Paul