| From: | Tom Lane <tgl(at)sss(dot)pgh(dot)pa(dot)us> |
|---|---|
| To: | Alastair McKinley <a(dot)mckinley(at)analyticsengines(dot)com> |
| Cc: | Devrim Gündüz <devrim(at)gunduz(dot)org>, PostgreSQL Hackers <pgsql-hackers(at)lists(dot)postgresql(dot)org> |
| Subject: | Re: CUBE_MAX_DIM |
| Date: | 2020-06-25 16:43:12 |
| Message-ID: | 2277294.1593103392@sss.pgh.pa.us |
| Lists: | pgsql-hackers |
Alastair McKinley <a(dot)mckinley(at)analyticsengines(dot)com> writes:
> I know that Cube in its current form isn't suitable for nearest-neighbour searching these vectors in their raw form (I have tried recompilation with higher CUBE_MAX_DIM myself), but conceptually kNN GiST searches using Cubes can be useful for these applications. There are other pre-processing techniques that can be used to improve the speed of the search, but it still ends up with a kNN search in a high-ish dimensional space.
Is there a way to fix the numerical instability involved? If we could do
that, then we'd definitely have a use-case justifying the work to make
cube toastable.
regards, tom lane
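[For context, the kNN GiST pattern under discussion looks roughly like the sketch below; the table, column, and query values are illustrative, not from the thread.]

```sql
-- Hypothetical sketch of a cube-based kNN GiST search.
-- CUBE_MAX_DIM (100 by default, in contrib/cube/cubedata.h)
-- caps the number of dimensions unless you recompile.
CREATE EXTENSION IF NOT EXISTS cube;

CREATE TABLE embeddings (
    id  serial PRIMARY KEY,
    vec cube  -- a zero-volume cube used as a point vector
);

CREATE INDEX embeddings_vec_idx ON embeddings USING gist (vec);

-- Index-assisted nearest-neighbour search by Euclidean distance:
SELECT id
FROM embeddings
ORDER BY vec <-> cube(ARRAY[0.1, 0.2, 0.3])  -- query point
LIMIT 10;
```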