RE: ASCII as a subset of Unicode: Some history

From: Hohberger, Clive (CHohberger@zebra.com)
Date: Sat Apr 11 2009 - 22:12:11 CDT

    Jonathan Rosenne wrote:

    >> Actually, I was under the impression that ASCII was defined in terms
    >> of 7-bit code units, whereas there are virtually no computers or
    >> users today who think in terms of 7-bit code units.
    >
    > There weren't such computers then, it was a communication code and 7
    > bits were used for communication.

    Jukka Korpela wrote:

    I'm not sure what you mean by "such" here, but in fact, even in the
    1980s and early 1990s, the DECsystem-10 and DECSYSTEM-20 (the PDP-10
    family) used a word length of 36 bits, packing five 7-bit ASCII
    characters in one word (and using the spare bit for special purposes).

    ASCII was surely designed to allow implementations where 7 bits are used
    for one character. Don't confuse this with the current situation where
    such implementations are obsolete and "everyone" uses at least 8 bits
    for a character, even when working with ASCII only.

    ----------
    Jony and Jukka are both correct:

    ASCII was never designed as a computer code: it was an outgrowth of
    telegraphic codes, evolving from the 5-bit Baudot code developed by
    Émile Baudot in 1870. The thing about a 7-bit code is that it contains
    enough code points to support both the upper- and lower-case American
    English alphabet, digits, and punctuation. In other words, normal
    American English text.
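
    To put rough numbers on that (my own back-of-the-envelope tally, not
    anything from the standards committee): 26 + 26 letters plus 10 digits
    already take 62 code points, so a 6-bit code (64 code points) leaves no
    room for punctuation, space, or control codes; 7 bits gives 128 code
    points, which ASCII ultimately divided into 95 printable characters and
    33 controls. A quick check of that head count:

        # Illustrative tally only; the split below is the familiar ASCII one.
        printable = 26 + 26 + 10 + 32 + 1    # letters, digits, punctuation, space
        controls = 32 + 1                    # 0x00-0x1F plus DEL
        assert printable + controls == 2**7  # exactly fills a 7-bit code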

    So, how did ASCII end up in computers? It started first in Europe, with
    IBM's Bob Bemer developing the ESC sequence and Hugh Ross at Ferranti
    championing it; and it made great sense to those of us designing
    computers using the technologies of the 1960s. And as computer
    applications started supporting text document input and output rather
    than Teletype I/O, ASCII became essential as the I/O code.

    "On March 11, 1968, U.S. President Lyndon B. Johnson mandated that all
    computers purchased by the United States federal government support
    ASCII, stating:

        I have also approved recommendations of the Secretary of Commerce
    regarding standards for recording the Standard Code for Information
    Interchange on magnetic tapes and paper tapes when they are used in
    computer operations. All computers and related equipment configurations
    brought into the Federal Government inventory on and after July 1, 1969,
    must have the capability to use the Standard Code for Information
    Interchange and the formats prescribed by the magnetic tape and paper
    tape standards when these media are used."

    (from: http://en.wikipedia.org/wiki/ASCII)

    I worked with 36-bit machines in the 1960s (Univac 1107 & 1108) and the
    DEC PDP-10 family in the 1970s: 7-bit ASCII was regarded as very
    efficient for storage back when core memory cost >$1/bit (as contrasted
    with last week, when I paid $7 for 1 GB of 667 MHz SDRAM!). As a
    result, back in the late 1960s we were a bit appalled to see IBM move
    to the System/360, packing only four 8-bit characters into one 32-bit
    word. Little did we know how prescient that decision would be, with the
    advent of integrated circuits and Moore's Law.
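
    For concreteness, here is a rough sketch of the two packings being
    compared; it is just an illustration (in Python, which of course none
    of those machines ran), and the 36-bit layout is assumed to put the
    leftmost character in the high-order bits and leave the low-order bit
    spare:

        def pack36(five_chars):
            """Pack five 7-bit ASCII characters into one 36-bit word."""
            assert len(five_chars) == 5
            word = 0
            for c in five_chars:
                assert ord(c) < 128           # must fit in 7 bits
                word = (word << 7) | ord(c)   # append the next 7-bit byte
            return word << 1                  # low-order bit left spare

        def pack32(four_chars):
            """Pack four 8-bit characters into one 32-bit word, System/360 style."""
            assert len(four_chars) == 4
            word = 0
            for c in four_chars:
                word = (word << 8) | ord(c)   # append the next 8-bit byte
            return word

        # 5 characters per 36 bits vs. 4 characters per 32 bits:
        # 7.2 bits/char vs. 8 bits/char -- the storage gap that mattered at >$1/bit.
        assert pack36("ASCII") < 2**36 and pack32("CODE") < 2**32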

    Clive P. Hohberger, PhD
    Consultant to
    Zebra Technologies Corporation
    333 Corporate Woods Parkway
    Vernon Hills, IL 60061-3109 USA
     


