Re: Origin of the U+nnnn notation

From: Hans Aberg (haberg@math.su.se)
Date: Tue Nov 08 2005 - 12:58:14 CST


    On 8 Nov 2005, at 15:04, Philippe Verdy wrote:

    > There are *no* negative code points in either standard (U-0001
    > does not designate a 32-bit code unit that you could store in a
    > signed wide-char datatype; in the earlier standard it designated
    > the same code point that U+0001 designates now). Using "+" makes
    > the statement about signs clear: standard code points all have
    > positive values.

    What about U+0000, a non-positive value? :-)

       Hans Aberg
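
    To make the range claim in the quoted paragraph concrete, here is a
    minimal C sketch. The function name is_valid_code_point and the use of
    int32_t are illustrative assumptions, not anything defined by either
    standard: code points are exactly the integers 0 through 0x10FFFF, so
    none is negative, and U+0000 sits at the lower bound (zero, not
    strictly positive).

        /* Sketch: Unicode code points are non-negative values in the
         * range 0..0x10FFFF, so a signed 32-bit type can hold them all,
         * but no valid code point is ever negative. */
        #include <stdint.h>
        #include <stdio.h>

        static int is_valid_code_point(int32_t cp)
        {
            return cp >= 0 && cp <= 0x10FFFF;  /* U+0000 allowed: zero, not positive */
        }

        int main(void)
        {
            printf("%d %d %d\n",
                   is_valid_code_point(0x0000),  /* 1: U+0000 is a code point  */
                   is_valid_code_point(0x0001),  /* 1: U+0001                  */
                   is_valid_code_point(-1));     /* 0: no negative code points */
            return 0;
        }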


