**From:** Philippe Verdy (*verdy_p@wanadoo.fr*)

**Date:** Tue Nov 11 2003 - 18:03:26 EST

**Previous message:** jameskass@att.net: "Re: Hexadecimal digits?" · **In reply to:** Kenneth Whistler: "RE: Hexadecimal digits?" · **Next in thread:** jameskass@att.net: "Re: Hexadecimal digits?"

From: "Kenneth Whistler" <kenw@sybase.com>

To: <Jill.Ramonsky@Aculab.com>

Cc: <unicode@unicode.org>; <kenw@sybase.com>

Sent: Tuesday, November 11, 2003 8:55 PM

Subject: RE: Hexadecimal digits?

> Jill Ramonsky summarized:
>
> > In summary then, suggestions which seem to cause considerably less
> > objection than the Ricardo Cancho Niemietz proposal are:
> > (1) Invent a new DIGIT COMBINING LIGATURE character, which allows you to
> > construct any digit short of infinity
> > (2) Use ZWJ for the same purpose
> > (3) Invent two new characters BEGIN NUMERIC and END NUMERIC which force
> > reinterpretation of intervening letters as digits
>
> Actually, I don't think these cause considerably less objection.
> They are simply suggestions which Philippe has made, and which
> haven't been potshot sufficiently yet on the list.
>
> I note that Philippe and you *have* reached consensus that you are
> talking about extending the list of digits, and are not concerned
> about Ricardo Cancho Niemietz's issue of fixed-width digit display.

The only consensus is that the RCN proposal does not fix any issue. When I suggested something else, it was within the given context of natural sort or the mathematical representation of numbers as sequences of digits. But I don't like that idea either; it is fundamentally flawed: why limit ourselves to radix 16?

After all, we work every day with radix-1000 numbers whenever we use grouping separators (or spell numbers aloud). The way this separator is written depends on cultural conventions (dots, commas, spaces). Some cultures even group digits with radixes other than 1000 in mind (and this is naturally reflected in the spoken forms of numbers in those languages).
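To make the radix-1000 point concrete, here is a minimal Python sketch (not part of the original mail); the comma separator and Python's `,` format code are used purely for illustration, since the separator glyph itself is a cultural convention:

```python
# The thousands groups of a decimal number are exactly its base-1000
# "digits": the grouping separator makes the radix-1000 structure visible.
n = 1234567

grouped = f"{n:,}"  # Python's "," format code inserts grouping separators
base_1000_digits = [int(g) for g in grouped.split(",")]

print(grouped)           # 1,234,567
print(base_1000_digits)  # [1, 234, 567]
```

Each group is an ordinary integer below 1000, written with the existing decimal glyphs; no new digit characters are needed.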

Counting in base 10 is only the last step of an evolution based on anthropomorphic counters (most often the fingers of the hands), so it is not surprising that people counted in the past using different gestures before finding a way to write them down in a scripted form.

But even in that case, the set of digits in use has been limited by the number of gestures that are easy and fast to reproduce and recognize. So this anthropomorphic arithmetic is necessarily limited, unlike mathematics, where radixes are unlimited.

As for hexadecimal, it is very uncommon because it does not correspond to any anthropomorphic or natural measure. This means we have no accepted cultural conventions that give us spoken names for numbers formed with base-16 digits. It remains a system useful for talking to computers, which have well-defined and limited storage units. It is, as somebody noted earlier, only a handy shortcut for representing the state of a finite-state automaton. By itself, hexadecimal is not a counting system, even when we use the A-F glyph shortcuts to represent its digits.
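The "handy shortcut" view can be illustrated with a short Python sketch (an editorial addition, not from the original mail): each hex digit is just a label for one 4-bit group of a machine word.

```python
# One hexadecimal digit stands for one 4-bit group (a "nibble") of a byte:
# the letters A-F are glyph shortcuts for bit patterns, not spoken digits.
byte = 0b10111010

print(f"{byte:02X}")                             # BA
print(f"{byte >> 4:04b}", f"{byte & 0xF:04b}")   # 1011 1010

# 0b1011 -> 'B' and 0b1010 -> 'A': the hex string merely abbreviates
# the state of eight binary storage cells.
```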

So I would separate the use of hexadecimal in computers with binary logic from the general case needed for mathematics. Mathematics is full of freely invented notations based (most of the time) on existing symbols. Why couldn't the existing Unicode character set satisfy the needs of mathematicians?

When I look at arithmetic books dealing with numbers in various radixes, the notation is quite inventive but most often reuses existing glyphs with a presentation form, or with punctuation conventions. A number written like ³¹²⁴6¹⁴0¹²³⁴7²³9 is unambiguous to a mathematician used to a convention where the difference between leading high-order subdigits and low-order subdigits is marked by font size or positioning. This is markup, but it is still equivalent to using punctuation, as in (31246, 140, 12347, 239), or a vertical vector representation, or an expression using ordered series with indices marking the implied radix.

In summary, within mathematics we currently don't need specific marks to express numbers in any radix > 10. This is basic arithmetic, studied since antiquity, and until now we have not needed specific symbols to denote arbitrary digits in a radix-N arithmetic: we have always used existing digits from one of our natural counting systems, with additional markup and symbols to separate these virtual digits, plus varying conventions for using them, such as the notation of hours in a base-60 system.
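The base-60 example generalizes: any radix-N number can be written as a sequence of "virtual digits", each an ordinary integer rendered with existing decimal glyphs plus a separator. A minimal Python sketch (the function name `to_digits` is an illustrative choice, not from the original mail):

```python
def to_digits(n, radix):
    """Decompose a non-negative integer into its base-`radix` digits,
    most significant first. Each "digit" is a plain integer, so it can
    be written with existing decimal glyphs and a separator."""
    if n == 0:
        return [0]
    digits = []
    while n:
        n, d = divmod(n, radix)
        digits.append(d)
    return digits[::-1]

# 4210 seconds in base 60: 1 hour, 10 minutes, 10 seconds.
print(to_digits(4210, 60))  # [1, 10, 10]
# The same integer in base 16, with no new digit characters required:
print(to_digits(4210, 16))  # [1, 0, 7, 2]
```

Note that the base-16 result needs no A-F glyphs at all once the digits are separated, which is the point of the argument above.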

So I wonder why we would ever need digits A to F. The only good reason would be that we start using hexadecimal as our preferred (cultural) counting system, within a language that can then safely name the numbers formed with these digits. Even in that case, I really doubt such a language would keep the letters A to F as good glyphs to represent these digits.

A language or culture with binary-based counting may already exist (or have existed) on Earth. But if people wrote numbers within that cultural context, they used another script; they did not use our common Arabo-European decimal digits (now encoded in ASCII)...

So it seems easier to do nothing for now: wait and see, and don't add digits to Unicode as long as people are not using them within their natural language. For such people, our binary machines will become very user-friendly, and our decimal counting system will look quite complex or tricky to use. However, we have the same feeling toward cultures using other radixes for some frequent and useful operations, such as payments (do you remember the Victorian British system of accounting units?).

But just look around you, and you'll find many products counted or sold by the dozen (a very long tradition which predates the adoption of the decimal system). Do we still need digits to represent ten and eleven separately? No... Were there glyphs for these two digits in the past? Possibly, but they have proven not to be useful, as nearly nobody knows them...


*This archive was generated by hypermail 2.1.5: Tue Nov 11 2003 - 18:55:00 EST*