Re: numeric/decimal docs bug?

From: Bruce Momjian <pgman(at)candle(dot)pha(dot)pa(dot)us>
To: Jan Wieck <janwieck(at)yahoo(dot)com>
Cc: Tom Lane <tgl(at)sss(dot)pgh(dot)pa(dot)us>, Peter Eisentraut <peter_e(at)gmx(dot)net>, Tatsuo Ishii <t-ishii(at)sra(dot)co(dot)jp>, pgsql-hackers(at)postgresql(dot)org, Jan Wieck <JanWieck(at)yahoo(dot)com>
Subject: Re: numeric/decimal docs bug?
Date: 2002-04-11 21:39:26
Message-ID: 200204112139.g3BLdRQ08110@candle.pha.pa.us
Lists: pgsql-hackers

Jan Wieck wrote:
> Bruce Momjian wrote:
> > Jan Wieck wrote:
> > > > The hard limit is certainly no more than 64K, since we store these
> > > > numbers in half of an atttypmod. In practice I suspect the limit may
> > > > be less; Jan would be more likely to remember...
> > >
> > > It is arbitrary of course. I don't recall completely; I'd
> > > have to dig into the code, but there might be some side
> > > effects when mucking with it.
> > >
> > > The NUMERIC code increases the actual internal precision
> > > when doing multiply and divide, which happens a gazillion
> > > times when evaluating higher functions like trigonometry.
> > > I think there was some connection between the max precision
> > > and how high this internal precision can grow, so increasing
> > > the precision might affect the computational performance of
> > > such higher functions significantly.
> >
> > Oh, interesting, maybe we should just leave it alone.
>
> As I said, I have to look at the code. I'm pretty sure that
> it currently will not use hundreds of digits internally if
> you use only a few digits in your schema. So changing it
> isn't that dangerous.
>
> But who's going to write and run a regression test ensuring
> that the new high limit can really be supported? I haven't
> even run the numeric_big test lately, which tests with at
> least 500 digits of precision ... and therefore takes some
> time (yawn). To increase the number of digits used, you
> first need some other tool to generate the test data (I
> originally used bc(1) with some scripts). Based on that, we
> still claim that our system deals correctly with up to 1,000
> digits of precision.
>
> I don't like the idea of bumping that number up to some
> higher nonsense, claiming we support 32K digits of precision
> on exact numeric, when no one has ever tested whether natural
> log really returns its result at that precision rather than
> as a 30,000-digit approximation.
>
> I missed some of the discussion, because I considered the
> 1,000 digits to be complete nonsense already and dropped the
> thread. So could someone please enlighten me as to the real
> reason for increasing our precision? AFAIR it had something
> to do with the docs. If it's just because the docs and the
> code aren't in sync, I'd vote for changing the docs.
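
On Jan's point above about whether natural log really delivers the
claimed precision: a crude round-trip spot check is possible from
plain SQL. Just a sketch; exp() and ln() exist for numeric, but how
many result digits come back depends on the internal scale selection:

SELECT exp(ln(2::numeric));
-- should come back as 2.000... to nearly the full number of
-- digits carried internally

A result that drifts away from 2 would show the internal precision
falling short, though agreement alone doesn't prove every digit right.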

I have done a little more research on this. If you create a numeric
column with no precision:

CREATE TABLE test (x numeric);

You can insert numeric values that are more than 1000 digits long:

INSERT INTO test VALUES ('1111(continues 1010 times)');

You can even do computations on it:

SELECT x+1 FROM test;
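
To reproduce this without pasting a thousand digits by hand, something
like the following should work (a sketch; it assumes the repeat()
string function is available and that the value above means 1010
one-digits in all):

SELECT length(repeat('1', 1010));           -- 1010-digit input
INSERT INTO test VALUES (repeat('1', 1010)::numeric);
SELECT length((x + 1)::text) FROM test;     -- still 1010 digits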

1000 is a pretty arbitrary limit. If we can handle 1000 digits, I
can't see how larger values could somehow fail.
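
For reference, the "half of an atttypmod" packing Tom mentions above
can be seen from SQL (a sketch, assuming the usual system catalogs):

SELECT atttypmod
FROM pg_attribute
WHERE attrelid = (SELECT oid FROM pg_class WHERE relname = 'test')
  AND attname = 'x';

For the unconstrained column above this returns -1, i.e. no typmod at
all, which is why no length check fires on insert. For numeric(p,s)
the stored value is roughly ((p << 16) | s) plus the VARHDRSZ offset,
which is where the 16-bit ceiling on declared precision comes from.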

Also, the numeric regression test takes much longer than the other
tests. I don't see why a test of that length is required compared to
the others. Probably time to pare it back a little.

--
Bruce Momjian | http://candle.pha.pa.us
pgman(at)candle(dot)pha(dot)pa(dot)us | (610) 853-3000
+ If your life is a hard drive, | 830 Blythe Avenue
+ Christ can be your backup. | Drexel Hill, Pennsylvania 19026
