From: "Guillaume Smet" <guillaume(dot)smet(at)gmail(dot)com>
To: cedric <cedric(dot)villemain(at)dalibo(dot)com>
Cc: pgsql-performance(at)postgresql(dot)org, "Richard Huxton" <dev(at)archonet(dot)com>, valgog <valgog(at)gmail(dot)com>
Subject: Re: Key/Value reference table generation: INSERT/UPDATE performance
Date: 2007-05-23 08:04:24
Message-ID: 1d4e0c10705230104o30db1cf6x86207a96b23f823a@mail.gmail.com
Lists: pgsql-performance
On 5/22/07, cedric <cedric(dot)villemain(at)dalibo(dot)com> wrote:
> I made something very similar, and using PL/pgSQL it is very slow, whereas
> using Perl it is very quick.
Another solution is to use tsearch2 for that:
CREATE TABLE word_counts AS
    SELECT * FROM stat('SELECT to_tsvector(''simple'',
        lower(coalesce(<field containing words>, '''')))
    FROM <your table>');
I don't know if the fact that you have an array of words is a must-have or
just a design choice. If you have to keep it, you can easily transform the
array into a string with array_to_string and use the same sort of query.
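As a sketch of that variant (the table and column names here, your_table and
words, are hypothetical, as is the assumption that words is a text[] column):

```sql
-- Hypothetical schema: your_table(words text[]).
-- Flatten the array into a space-separated string before handing it
-- to to_tsvector(), then let tsearch2's stat() count the lexemes.
CREATE TABLE word_counts AS
    SELECT * FROM stat('SELECT to_tsvector(''simple'',
        lower(coalesce(array_to_string(words, '' ''), '''')))
    FROM your_table');
```

Note the doubled single quotes: the inner query is passed to stat() as a
string literal, so every quote inside it has to be escaped.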
I don't know exactly what your speed requirements are, but it's quite
fast here. If you drop and recreate your table inside a transaction,
it should work like a charm (or you can use TRUNCATE and INSERT INTO).
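The TRUNCATE variant might look like this (again assuming the hypothetical
your_table/words names from above, and a word_counts table that already
exists). Since TRUNCATE is transactional in PostgreSQL, readers keep seeing
the old counts until the COMMIT:

```sql
BEGIN;
-- Empty the stats table; rolled back if anything below fails.
TRUNCATE word_counts;
-- Repopulate it from scratch with fresh word counts.
INSERT INTO word_counts
    SELECT * FROM stat('SELECT to_tsvector(''simple'',
        lower(coalesce(words, '''')))
    FROM your_table');
COMMIT;
```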
--
Guillaume