Re: UTF-16 inside UTF-8

From: John Cowan
Date: Thu Nov 06 2003 - 07:54:08 EST


    Doug Ewell scripsit:

    > To cite a non-Unicode example, in ECMAScript (née JavaScript) there is a
    > function Date.GetYear() that was intended to return the last two digits
    > of the year but actually returned the year minus 1900. Of course,
    > starting in 2000 the function returned a value which was useful to
    > practically nobody.

    How, not useful? C programmers have been dealing with (year - 1900)
    since the 70s, and it is now 103. :-)

    > Did Sun or ECMA change the definition of
    > Date.GetYear()? No, they introduced a new function, Date.GetFullYear(),
    > which does what users really want.

    I wonder why they bothered, since it can be defined in a single line of
    code.
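
    A plausible one-liner, assuming the legacy getYear() semantics of returning the year minus 1900 (again using the real lowercase ECMAScript method names):

```javascript
// Hypothetical one-line definition of getFullYear() in terms of
// the legacy getYear(), which returns (year - 1900):
Date.prototype.getFullYear = function () { return this.getYear() + 1900; };

console.log(new Date(2003, 10, 6).getFullYear()); // 2003
```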
    Now if GetYear() had indeed returned the last two digits, that would have
    been annoying, since it could be used only for presentation, and in
    order to get the actual year, one would have to impose an arbitrary
    heuristic to map the 2-digit value to a year number.

    Only do what only you can do.           John Cowan
      --Edsger W. Dijkstra,
        deceased 6 August 2002

    This archive was generated by hypermail 2.1.5 : Thu Nov 06 2003 - 08:36:53 EST