From: Jose Ildefonso Camargo Tolosa <ildefonso(dot)camargo(at)gmail(dot)com>
To: pgsql-sql(at)postgresql(dot)org
Subject: Question about POSIX Regular Expressions performance on large dataset.
Date: 2010-08-18 02:21:25
Message-ID: AANLkTi=0Qy1+tB-qzAH5oZhO5DVxyMptt90oE_FjSEwA@mail.gmail.com
Lists: pgsql-sql
Hi!
I'm analyzing the possibility of using PostgreSQL to store a huge
amount of data (around 1000M records, or so). Even though the records
are short (each one has just a timestamp and a string of fewer than
128 characters), the strings will be matched against POSIX regular
expressions (different, and possibly complex, regexps).
Because I don't have a system large enough to test this here, I have
to ask you (I might borrow a medium-size server, but that would take a
week or more, so I decided to ask here first). How is the performance
of regexp matching in PostgreSQL? Can it use indexes? My guess is
no, because I don't see a general way of indexing for regexp matching :(
, so this huge dataset would need table scans...
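[For readers of the archive: a minimal sketch of how one might check this, assuming a hypothetical table `events` matching the description above. The `EXPLAIN ANALYZE` output will show whether the planner resorts to a sequential scan. Note that in PostgreSQL releases after this thread (9.3 and later), the `pg_trgm` extension allows a GIN trigram index to accelerate many regexp matches.]

```sql
-- Hypothetical schema: a timestamp plus a short string, as described.
CREATE TABLE events (
    ts  timestamptz NOT NULL,
    msg varchar(128) NOT NULL
);

-- A POSIX regexp match; without a supporting index this forces a
-- sequential scan over the whole table.
EXPLAIN ANALYZE
SELECT ts, msg
FROM events
WHERE msg ~ 'foo(bar|baz)[0-9]+';

-- On PostgreSQL 9.3+, a trigram GIN index can speed up such matches:
CREATE EXTENSION pg_trgm;
CREATE INDEX events_msg_trgm_idx ON events USING gin (msg gin_trgm_ops);
```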
What do you think of this?
Sincerely,
Ildefonso Camargo