Re: IPA Null Consonant

From: Thomas M. Widmann (thomas@widmann.uklinux.net)
Date: Tue May 27 2003 - 17:44:06 EDT

    Peter_Constable@sil.org writes:

    > > >Yes, I think you're right that an annotation is best -- but only
    > > >if EMPTY SET is indeed the right character. I'm increasingly of
    > > >the opinion that a different character might be needed.
    > >
    > > I would disagree.
    >
    > As would I.

    Oh dear, if you both disagree with me, my chances of getting through
    with this look slim indeed... :-)

    But I'm wondering why.

    I think we all agree on the following:

    - Ø [LATIN CAPITAL LETTER O WITH STROKE] and ø [LATIN SMALL LETTER O
      WITH STROKE] are both ruled out, as their semantics are entirely
      wrong.

    - 0 [DIGIT ZERO] is also ruled out because it looks wrong in most
      fonts (and one might argue that the semantics isn't exactly right,
      either).

    - ∅ [EMPTY SET] is the best choice if a single character has to be
      chosen from the current Unicode repertoire.

    - But while ∅ [EMPTY SET] is normally just as wide as it is tall (it's
      really just a circle with a stroke), the null symbol as used in
      linguistics frequently looks more like 0 [DIGIT ZERO] with an added
      stroke. (Many variations exist, including ∅ [EMPTY SET], ø
      [LATIN SMALL LETTER O WITH STROKE] and other symbols, most of which
      can be explained by typesetters and word-processing programs that
      didn't know what they were doing.)

    - Furthermore, semantically an empty set is not really the same thing
      as a null symbol. (They both represent 'nothing', but so does 0
      [DIGIT ZERO] and possibly other Unicode characters as well.)

    - However, 0 [DIGIT ZERO] + ̸ [COMBINING LONG SOLIDUS OVERLAY] --
      which is close to how linguists used to type it in the old
      typewriter days -- is also a bad idea, since it would render badly
      in most programs (a small sketch below illustrates why).
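    As a purely illustrative aside (my own sketch, not part of any
    proposal), here is a minimal Python snippet that prints the
    candidates with their code points and shows that the typewriter-style
    sequence has no precomposed form, so its appearance is left entirely
    to the font:

        import unicodedata

        # The candidate characters discussed above.
        candidates = ["\u00D8", "\u00F8", "0", "\u2205"]
        for ch in candidates:
            print(f"U+{ord(ch):04X}  {unicodedata.name(ch)}")

        # DIGIT ZERO + COMBINING LONG SOLIDUS OVERLAY: NFC leaves it as
        # two code points, since Unicode defines no precomposed
        # zero-with-stroke; how it renders depends wholly on the font.
        null_symbol = "0\u0338"
        nfc = unicodedata.normalize("NFC", null_symbol)
        print([f"U+{ord(c):04X}" for c in nfc])  # ['U+0030', 'U+0338']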

    If you agree with all of the above, I'm wondering what the argument is
    against a new Unicode character, called NULL or NULL SYMBOL. Surely,
    if it looks different from every existing character and has a
    well-defined meaning that no existing character covers, there must be
    a good case for adding it...?

    Cheers,

    Thomas

    -- 
    Thomas Widmann, MA      +44  141 419 9872       Glasgow, Scotland, EU
    thomas@widmann.uklinux.net             http://www.widmann.uklinux.net
    

