* John Cowan
| C1 says "A process shall interpret Unicode code values as 16-bit
This I find mightily confusing. Why say something like this when
there are (well, will be) characters that cannot be represented with
16 bits in any of the Unicode encodings?
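A quick illustration (mine, not from the mail or the standard): any character beyond the Basic Multilingual Plane takes two 16-bit code units (a surrogate pair) in UTF-16, so a single 16-bit code value cannot represent it.

```python
# Encode U+10000, the first character past the BMP, in UTF-16
# and show that it occupies two 16-bit code units, not one.
ch = "\U00010000"
data = ch.encode("utf-16-be")
units = [int.from_bytes(data[i:i + 2], "big") for i in range(0, len(data), 2)]
print([hex(u) for u in units])  # surrogate pair: ['0xd800', '0xdc00']
```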
| "Code unit" is defined in definition D5 as a synonym for "code
| value". If this needs updating,
Unless I've really misunderstood something, it does need updating.
This archive was generated by hypermail 2.1.2 : Tue Jul 10 2001 - 17:21:06 EDT