Re: Hexadecimal

From: Adam Twardoch (list.adam@twardoch.com)
Date: Sat Aug 16 2003 - 21:08:18 EDT

  • Next message: Doug Ewell: "Re: Hexadecimal"

    From: "Peter Kirk" <peter.r.kirk@ntlworld.com>
    > But I am not suggesting that this problem is sufficiently serious to
    > justify encoding a new set of hex digits.

    The idea of separately encoding digits used to express numbers in a base
    notation other than decimal is insane.

    Think of it the following way: numbers in base 10 notation are expressed
    using decimal digits (0123456789). Think of European decimal digits as a
    separate writing system here. Notations with a base other than 10 don't have
    "native" digits, so they "borrow" symbols from other writing systems. These
    _happen_ to be European decimal digits and Roman letters purely by
    convention.

    Numbers expressed in a base 16 (hexadecimal) notation "borrow" the symbols
    0123456789 from the European decimal digits set and the symbols ABCDEF from
    the Roman alphabet. Numbers expressed in a base 8 (octal) notation "borrow"
    the symbols 01234567 from the European decimal digits set. Numbers expressed
    in a base 24 notation "borrow" the symbols 0123456789 from the European
    decimal digits set and the symbols ABCDEFGHIJKLMN from the Roman alphabet.
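    The "borrowing" described above can be sketched in a few lines of Python
    (my illustration, not part of the original message): a single symbol pool
    of European decimal digits plus Roman letters serves every base up to 24.

    ```python
    # One shared symbol pool, "borrowed" by every base notation up to 24.
    DIGITS = "0123456789ABCDEFGHIJKLMN"

    def to_base(n, base):
        """Render a non-negative integer in the given base (2..24)."""
        if n == 0:
            return "0"
        out = []
        while n:
            n, r = divmod(n, base)
            out.append(DIGITS[r])
        return "".join(reversed(out))

    print(to_base(255, 16))  # FF  (hexadecimal)
    print(to_base(255, 8))   # 377 (octal)
    print(to_base(255, 24))  # AF  (base 24)
    ```

    Note that nothing in the conversion cares which base is in use: the same
    code points serve as "digits" for base 8, 16, or 24 alike.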

    If you are to encode the "hexadecimal digits", you should encode the entire
    set "0123456789ABCDEF", not only "ABCDEF", because you could say that
    "semantically" (although I think this word is overused in Unicode
    discussions) the symbols 0123456789 used for denoting base 16 numbers are
    "other" than the symbols used for denoting base 10 numbers.

    But then, you should also remember to encode separate digit sets for base 5,
    base 9, base 13, base 17, base 20, base 75, and who knows how many other
    numeric notations, because there is nothing special about "hexadecimal
    numbers" -- they're no different from base 17 or base 22 numbers.

    Seriously: since no writing system "natively" uses hexadecimal digits
    (except for a bunch of crazy programmers), there is no reason to encode
    them. As I mentioned before, in international mathematics, they are
    represented using symbols borrowed from other writing systems, and there
    are no separate symbols that would need to be encoded.

    Regards,

    --
    Adam Twardoch   adam{at}twardoch{dot}com   http://www.twardoch.com/adam
    Central European typography, type design, OpenType font engineering.
    MyFonts.com & Linotype typographic consultant. FontLab forum manager.
    Board member and country delegate for Poland and ATypI.
    


    This archive was generated by hypermail 2.1.5 : Sat Aug 16 2003 - 21:43:30 EDT