RE: Definitions

From: jameskass@att.net
Date: Tue Nov 18 2003 - 21:54:46 EST


    Peter Constable wrote,

    > So, we need to decide: are we going to debate what follows the letter of
    > the conformance laws, or what is useful?

    Since we seem to agree on what is useful, it might be difficult
    to debate.

    (I had thought we were debating what is the letter of the conformance
    laws and drifting a bit into what should be the letter of the
    conformance laws.)

    Earlier, Peter Constable wrote,

    > It is perfectly acceptable for a conformant application to use every
    > single PUA codepoint for its own internal purposes, and to reject
    > incoming PUA codepoints or display them with some default "not
    > supported" glyph. ...

    And, I'd made some kind of response. Here's another try at it...

    Isn't it acceptable for any application to use any byte sequence
    internally?

    Inside a program, for instance:

         a = 0

    The letter "a" can be used as a memory variable. The programmer has
    just re-assigned its value internally. It is no longer the letter "a", it
    is now the number zero.

    This variable might be used as a counter for a loop, or whatever.
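
    In rough Python terms, just to make the sketch concrete (Python is only
    my choice of illustration here), that could be:

         a = 0                          # "a" is now just a name for the number zero
         while a < 3:
             print("loop pass", a)      # used as an internal counter...
             a = a + 1                  # ...and no character "a" is consumed by it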

    Later in the program, there could be an opportunity for the user
    to enter a choice with the keyboard. The screen could look
    something like:

    ***

         Please enter one of the following choices:

         "a" = Sarasvati gets to run the e-mail list as she pleases.

          - or -

         "b" = Sarasvati gets to run the e-mail list as she pleases.

         Please type the letter "a" or "b": | |

    Press any key to continue...

    ***

    It isn't necessary to release the internal memory variable "a" in order for
    the user to externally indicate a choice by typing in the letter "a" on the
    keyboard.

    If the memory variable "a" still equals zero, we don't expect the zero
    character to display on the screen when the user enters the letter "a".
    Nor should we expect some kind of default glyph meaning "this code
    point is being used internally, so you can't have it."

    If the program calls for the display of an external text file, the letter
    "a" really needs to appear where it's expected.

    Otherwise, I'd not consider the program to be "ASCII-conformant".
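
    The whole analogy, in the same rough Python (the file name is invented,
    purely for the sketch):

         a = 0                                       # internal variable, still in use
         choice = input('Please type the letter "a" or "b": ')
         print("You chose:", choice)                 # echoes the letter typed, not a zero
         contents = open("some_file.txt").read()     # an external text file
         print(contents)                             # every "a" in it shows up as "a"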

    Likewise, any application claiming Unicode conformance which munges
    text or distorts displays for any valid Unicode range...
    well, you know where I'm going with this....
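
    Sketched the same way (the function name and code points below are made
    up, not anyone's actual API): a program may reserve a PUA code point for
    its own use, but that's no excuse to rewrite the PUA characters arriving
    in somebody else's text.

         PUA_INTERNAL = "\uE000"        # code point this program reserves for its own use

         def receive(text):
             # Reserving U+E000 internally is no reason to rewrite external text;
             # unknown PUA code points stay in the data (a font may show a
             # "missing glyph" box for them, but the code points survive).
             return text

         incoming = "external text with \uE001 in it"
         print(receive(incoming) == incoming)        # True -- nothing got munged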

    Best regards,

    James Kass


