From: Ron Mayer <rm_pg(at)cheapcomplexdevices(dot)com>
To: Euler Taveira de Oliveira <euler(at)timbira(dot)com>
Cc: Robert Haas <robertmhaas(at)gmail(dot)com>, "pgsql-hackers(at)postgresql(dot)org" <pgsql-hackers(at)postgresql(dot)org>
Subject: Re: explain analyze rows=%.0f
Date: 2009-06-02 03:30:49
Message-ID: 4A249CE9.6050708@cheapcomplexdevices.com
Lists: pgsql-hackers
Euler Taveira de Oliveira wrote:
> Robert Haas escreveu:
>> ...EXPLAIN ANALYZE reports the number of rows as an integer... Any
>> chance we could reconsider this decision? I often find myself wanting
>> to know the value that is here called ntuples, but rounding
>> ntuples/nloops off to the nearest integer loses too much precision.
>>
> Don't you think it is too strange having, for example, 6.67 rows? It would confuse
> users and programs that parse the EXPLAIN output. However, I wouldn't object
I don't think it's that confusing. If it says "0.1 rows", I imagine most
people would infer that this means "typically 0, but sometimes 1 or a few" rows.
What I'd find strange about "6.67 rows" in your example is more on the estimated-rows
side: it seems to imply an unrealistically precise estimate, in the same way that
"667 rows" would seem unrealistically precise to me.
Maybe rounding to 2 significant digits would reduce confusion?