How are UTF-8, UTF-16 and UTF-32 encoded?

From: Theodore H. Smith (delete@softhome.net)
Date: Wed May 29 2002 - 08:11:35 EDT


I need to know exactly how UTF-8, UTF-16 and UTF-32 are encoded. I heard
that UTF-32 can have surrogates, so I can't just expect the code units
to be scalar values.

A nice, detailed and clear explanation would help, with
plenty of examples, the effects of each encoding, and anything
else that makes it easier to understand.
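To show the kind of example I mean, here is a quick sketch (Python, my
own illustration, not from any spec) encoding one supplementary-plane
character, U+1D11E MUSICAL SYMBOL G CLEF, in all three forms — four
bytes in UTF-8, a surrogate pair in UTF-16, and a single 32-bit scalar
in UTF-32:

```python
# Hypothetical illustration: one code point, three encoding forms.
ch = "\U0001D11E"  # MUSICAL SYMBOL G CLEF, code point 0x1D11E

utf8  = ch.encode("utf-8")      # 4 bytes: leading byte + 3 continuation bytes
utf16 = ch.encode("utf-16-be")  # surrogate pair (high D834, low DD1E)
utf32 = ch.encode("utf-32-be")  # the scalar value itself, zero-padded to 32 bits

print(utf8.hex())   # f09d849e
print(utf16.hex())  # d834dd1e
print(utf32.hex())  # 0001d11e
```

That surrogate pair only appears in UTF-16; the UTF-32 unit is just the
scalar value, which is part of what I'd like confirmed.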

Or perhaps I'm just reacting to the confusion of the Unicode
website, and it's not that hard to understand and a simple definition
would do? But the first idea certainly wouldn't hurt.

--
     Theodore H. Smith - Macintosh Consultant / Contractor.
     My website: <www.elfdata.com/>



This archive was generated by hypermail 2.1.2 : Wed May 29 2002 - 12:57:24 EDT