Re: Perception that Unicode is 16-bit (was: Re: Surrogate space i

From: Tom Lord (lord@emf.net)
Date: Thu Feb 22 2001 - 00:44:41 EST


        What exactly _would_ be wrong with calling UNICODE a
        thirty-two-bit encoding?

If I have a 32-bit integer type holding a Unicode code point, I have
11 bits left over to hold other data. That's worth knowing.
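A minimal sketch of the kind of packing this makes possible. The field
layout, names, and the idea of an 11-bit "tag" are purely illustrative
assumptions on my part, not anything Unicode itself specifies:

    #include <assert.h>
    #include <stdint.h>
    #include <stdio.h>

    /* Unicode code points run from U+0000 to U+10FFFF, so 21 bits suffice.
     * In a 32-bit word that leaves 11 bits free for whatever the
     * application wants -- a type tag, GC bits, a small cache, etc.
     * Putting the tag in the high 11 bits is an arbitrary choice here. */
    #define CODEPOINT_BITS 21
    #define CODEPOINT_MASK ((1u << CODEPOINT_BITS) - 1)   /* 0x1FFFFF */

    static uint32_t pack(uint32_t codepoint, uint32_t tag)
    {
        assert(codepoint <= 0x10FFFF);   /* highest Unicode code point */
        assert(tag < (1u << 11));        /* must fit in the 11 spare bits */
        return (tag << CODEPOINT_BITS) | codepoint;
    }

    static uint32_t unpack_codepoint(uint32_t w) { return w & CODEPOINT_MASK; }
    static uint32_t unpack_tag(uint32_t w)       { return w >> CODEPOINT_BITS; }

    int main(void)
    {
        uint32_t w = pack(0x10FFFF, 0x7FF);  /* largest value of each field */
        printf("codepoint=U+%04X tag=%u\n", unpack_codepoint(w), unpack_tag(w));
        return 0;
    }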

        Btw, saying approximately 20.087 bits (Am I calculating that
        right -- log2[ 17*65536 ]?) causes many people to think they
        are just being teased.

They'll get over it.
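
For what it's worth, the quoted arithmetic checks out, assuming 17 planes
of 65536 code points each; a quick sketch of the check:

    #include <math.h>
    #include <stdio.h>

    /* 17 planes * 65536 code points/plane = 1,114,112 code points,
     * so log2(17 * 65536) = 16 + log2(17) ~= 20.087 bits,
     * which rounds up to 21 whole bits. */
    int main(void)
    {
        double bits = log2(17.0 * 65536.0);
        printf("%.3f bits (%d whole bits)\n", bits, (int)ceil(bits));
        return 0;
    }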

Thomas Lord
regexps.com


