Re: Perception that Unicode is 16-bit (was: Re: Surrogate space in

From: DougEwell2@cs.com
Date: Wed Feb 21 2001 - 00:48:05 EST


I wrote:

>> Even 8-bit ASCII is a correct term meaning ISO-8859-1.
>
> I would question that. Understandable, yes, but not really correct.

jcowan@reutershealth.com wrote:

> No, it *is* correct. ANSI X3 (which has a new name these days) in fact
> did define an 8-bit American Standard Code for Information Interchange,
> being exactly the same as ISO 8859-1.
>
> Of course, that does not affect the definition of the 7-bit American
> Standard Code.

Meanwhile, roozbeh@sharif.edu wrote:

> In the computer culture I grew up in, 8-bit ASCII meant CP437. Every author
> called the CP437 table that appeared at the end of computer books "the
> ASCII table."

And perhaps the Mac people think of MacRoman as "8-bit ASCII." The 8-bit
extensions to ASCII are just that, extensions -- they are not ASCII. Even
ISO 8859-1 cannot be called "the" 8-bit ASCII -- if it were, there would be
no need for ISO 8859-2, -3, -4, -5, etc.
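
As a small illustration (a sketch in modern Python, added here purely for
illustration; the byte value 0xE9 is chosen arbitrarily), one and the same
byte decodes to a different character under each of these would-be "8-bit
ASCIIs":

    # One byte, three "8-bit ASCIIs", three different characters.
    b = bytes([0xE9])
    print(b.decode('iso-8859-1'))  # 'é'  LATIN SMALL LETTER E WITH ACUTE
    print(b.decode('cp437'))       # 'Θ'  GREEK CAPITAL LETTER THETA
    print(b.decode('mac_roman'))   # 'È'  LATIN CAPITAL LETTER E WITH GRAVE
    # Under 7-bit ASCII proper, 0xE9 is not a character at all:
    # b.decode('ascii') raises UnicodeDecodeError.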

Of course, it could be worse. Ten years ago, one of the WordPerfect experts
at my work had a name for all those strange Greek, Cyrillic, box-drawing and
happy-face characters that were listed in Appendix Z of the manual and had to
be entered with a special {x, y} key sequence. She called them "the ASCII
characters."

-Doug Ewell
 Fullerton, California


