Re: Perception that Unicode is 16-bit (was: Re: Surrogate space in

From: John Cowan
Date: Tue Feb 20 2001 - 14:18:05 EST

>> Even 8-bit ASCII is a correct term meaning ISO-8859-1.
> I would question that. Understandable, yes, but not really correct.

No, it *is* correct. ANSI X3 (which has a new name these days) did in fact
define an 8-bit American Standard Code for Information Interchange,
identical to ISO 8859-1.

Of course, that does not affect the definition of the 7-bit American
Standard Code.
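
A quick Python illustration of the 7-bit/8-bit relationship (just a sketch,
not anything defined by either standards body): ISO 8859-1 maps every byte
0x00-0xFF directly to the Unicode code point with the same value, and only
the lower 128 of those values are also valid 7-bit ASCII.

    # All 256 byte values, decoded as ISO 8859-1 (Latin-1).
    all_bytes = bytes(range(256))
    latin1_text = all_bytes.decode("iso-8859-1")

    # Every byte decodes to the code point with the same numeric value.
    assert all(ord(ch) == b for ch, b in zip(latin1_text, all_bytes))

    # The lower half (0x00-0x7F) is exactly the 7-bit ASCII repertoire.
    assert all_bytes[:128].decode("ascii") == latin1_text[:128]

    # The upper half (0x80-0xFF) is not valid 7-bit ASCII at all.
    try:
        all_bytes[128:].decode("ascii")
    except UnicodeDecodeError:
        print("bytes 0x80-0xFF are outside the 7-bit code")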

> I thought "Roman" was simply an alternate word for "Latin," but Jorg is
> correct. This is also an error.

It's too strong to call it an error.

There is / one art             || John Cowan <>
no more / no less              ||
to do / all things             ||
with art- / lessness           \\ -- Piet Hein
