Hmmm. I was hoping this discussion would go away after the initial round
of reasons why it won't happen.
> The problem being solved is properly supporting the base sixteen system.
It is already properly supported. In fact, Unicode contains far more than
a mere 16 entities sufficient for hexadecimal. With Unicode, any number
base up to about 94,000 can easily be represented. It should satisfy even
the hippest numerologists.
> Also using letters to stand for numeric data can lead to confusion--
> this is BAD.
Perhaps, but it's like spelling versus spelling reform. The current
representation is already ingrained in computer-literate culture, and
you'll be hard-pressed to change it, especially without a compelling story.
And this story isn't very compelling.
> it is better to have 16 in a row as it makes computation of a numeric
> value from the character value easier and more straightforward.
So what? This isn't rocket science. The hex-binary conversion problem is
so trivial that every beginning CS student has probably had a homework
assignment to solve it. Big deal. Five lines of library code.
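The "five lines" claim holds up: a minimal sketch of the conversion in
Python (the function name is illustrative, not from any particular library):

```python
def hex_digit_value(ch):
    """Map one of 0-9, a-f, A-F to its numeric value 0-15."""
    if '0' <= ch <= '9':
        return ord(ch) - ord('0')
    if 'a' <= ch <= 'f':
        return ord(ch) - ord('a') + 10
    if 'A' <= ch <= 'F':
        return ord(ch) - ord('A') + 10
    raise ValueError("not a hex digit: %r" % ch)
```

The two alphabetic branches exist precisely because the sixteen digits are
not contiguous code points, which is the inconvenience the proposal cites,
and they cost all of two extra lines.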
> ...But then you lose the unambiguousness of sixteen separate characters.
We already have: everybody already uses 0-9, a-f and A-F, there's tons of
software that already deals with this, and mounds of existing data. The
problem wouldn't be solved; it would be augmented with yet another
representation.
The proposal is a non-starter. There isn't even a glimmer of serious
interest here, and it's rather pointless to continue this discussion.
This archive was generated by hypermail 2.1.2 : Thu Jun 20 2002 - 18:02:20 EDT