Currently, CHAR is correctly interpreted as CHAR(1), but VARCHAR is
incorrectly interpreted as VARCHAR(<infinity>). Any reason for that,
besides the fact that it of course makes much more sense than VARCHAR(1)?
Additionally, neither CHAR nor VARCHAR seems to bark on too-long input;
they just truncate silently.
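
For illustration, here is roughly what I'm seeing (throwaway table,
names made up; I haven't double-checked the exact output):

    CREATE TABLE foo (a CHAR, b VARCHAR, c VARCHAR(2));
    -- a ends up as char(1), b ends up with no length limit
    INSERT INTO foo VALUES ('xyz', 'xyz', 'xyz');
    -- no error: a is silently truncated to 'x', c to 'xy', b keeps 'xyz'
    SELECT a, b, c FROM foo;
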
I'm asking because of the bit types: should they be made to imitate
this incorrect behaviour, or should they start out correctly?
--
Peter Eisentraut peter_e(at)gmx(dot)net http://yi.org/peter-e/