From: Peter Pilsl <pilsl(at)goldfisch(dot)at>
To: pgsql-general(at)postgresql(dot)org
Subject: query on large tables
Date: 2001-09-01 22:19:47
Message-ID: 20010902001947.A86425@i3.atat.at
Lists: pgsql-general
I have a query on a large table. The table contains approx. 100,000
entries, and the WHERE clause matches a 4-character string against a
100-character text column. That is hard work, and the query takes about
4 seconds on my system. This would be acceptable, but I want to prepare
the results for human readers and split them across several pages. So I
first query once to get the number of results, build a navigation bar
from that count, and construct a LIMIT clause. With this LIMIT clause I
query a second time to fetch and display the entries.
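
For illustration, a minimal sketch of that two-step pattern, assuming a
hypothetical table "articles" with a text column "body" and a LIKE
search for the 4-character string:

  -- Step 1: count the matches to build the navigation bar.
  SELECT count(*) FROM articles WHERE body LIKE '%abcd%';

  -- Step 2: fetch a single page of results (page size 20, third page).
  SELECT *
  FROM articles
  WHERE body LIKE '%abcd%'
  ORDER BY id
  LIMIT 20 OFFSET 40;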
In effect I ask Postgres the same query twice (the only difference is
the LIMIT part), and this takes 8 seconds instead of 4.
Any way to do better?
thnx,
peter
--
mag. peter pilsl
phone: +43 676 3574035
fax : +43 676 3546512
email: pilsl(at)goldfisch(dot)at
sms : pilsl(at)max(dot)mail(dot)at
pgp-key available