From: John H. Jenkins (email@example.com)
Date: Mon Jun 29 2009 - 10:28:08 CDT
On Jun 27, 2009, at 10:58 AM, Doug Ewell wrote:
> If you want to process *any arbitrary sequence of Unicode
> characters* as a string, then you may have problems with U+0000 --
> but that would have been true if you wanted to process any arbitrary
> sequence of bytes as an ASCII string.
In such a case, one usually uses U+FFFF, a noncharacter code point that
is guaranteed never to be assigned to a character and thus never to
appear in valid interchanged text.
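A minimal sketch of the idea, with hypothetical helper names: because
U+FFFF cannot occur in interchanged text, it can serve as an in-band
record separator even when the data itself may contain U+0000.

```python
# U+FFFF is a Unicode noncharacter: legal as a code point, but excluded
# from open interchange, so it cannot collide with real text data.
SENTINEL = "\uffff"

def join_records(records):
    # Pack records into one string; the separator is safe even if a
    # record contains U+0000, which IS a valid character in a string.
    for r in records:
        assert SENTINEL not in r  # noncharacters never appear in valid text
    return SENTINEL.join(records)

def split_records(packed):
    # Recover the original records losslessly.
    return packed.split(SENTINEL)

records = ["abc", "a\x00b", ""]   # note the embedded NUL
packed = join_records(records)
assert split_records(packed) == records
```

The same trick would not work with U+0000 as the separator, since a
NUL is ordinary data from Unicode's point of view.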
John H. Jenkins
This archive was generated by hypermail 2.1.5 : Mon Jun 29 2009 - 10:32:01 CDT