Re: The Unicode Standard and ISO

From: Marcel Schneider via Unicode <unicode_at_unicode.org>
Date: Sun, 10 Jun 2018 08:35:39 +0200 (CEST)

On Sat, 9 Jun 2018 12:56:28 -0700, Asmus Freytag via Unicode wrote:
[…]
> It's pushing this kind of impractical scheme that gives standardizers a bad name.
>
> Especially if it is immediately tied to governmental procurement, forcing people to adopt it (or live with it)
> whether it provides any actual benefit.

Or not. What I left untold is that governmental action effectively works in both directions (examples follow),
but governments hold no monopoly on that ambivalence. When the French NB took a position
against encoding Œœ in ISO/IEC 8859-1:1986, it wasn’t the government but a manufacturer that wanted to avoid
adding support for this letter in its printers. It’s not fully clear to me why the same happened to Dutch IJij.
Anyway, as a result we had (and, legacy doing the rest, still have) two languages that malfunction digitally.
Thanks to the work of Hugh McGregor Ross, Peter Fenwick, Bernard Marti and Loek Zeckendorf (ISO/IEC 6937:1983),
and from 1987 on thanks to the work of Joe Becker, Lee Collins and Mark Davis of Apple and Xerox, things started
working fine, and they keep working better and better thanks to Mark Davis’ ongoing commitment.

Industrial and governmental action are both ambivalent by nature, simply because human action may be
short-sighted or far-sighted for a variety of reasons. When the French NB issued a QWERTY keyboard standard in 1973
and revised it in 1976, short-sighted industrial interests were at work rather than governmental procurement. End-users
never adopted it, there was no market, and it has recently been withdrawn. When governmental action, hard scientific
work, human genius and a nascent industrialization brought into existence a working keyboard for French, one that is
usefully transposable to many other locales as well, it was enthusiastically adopted by end-users and everybody
urged the NB to standardize it. But the industry first asked for an international keyboard standard as a precondition…
(which ended up being an excellent idea as well). The rest of the story may be spared, as the conclusion is already clear.

There is one impractical scheme that bothers me, and that is that we have two hyphens, because the ASCII hyphen was
duplicated as U+2010. Now that font designers (e.g. in Lucida Sans Unicode) have taken the hyphen conundrum seriously,
to avoid spoofing or for whatever other reason, we’re supposed to have keyboard layouts with two hyphens, both being Gc=Pd.
That is where the related ISO WG2 could have been useful by taking a position against U+2010: disambiguating
the minus sign U+2212 and keeping the hyphen-minus U+002D in use, like e.g. the period, would have been sufficient.
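
For illustration, a minimal check of these properties with Python’s standard unicodedata module (just a
throwaway sketch, not part of any standard or proposal):

    import unicodedata

    for ch in ("\u002D", "\u2010", "\u2212"):
        # U+002D HYPHEN-MINUS, U+2010 HYPHEN, U+2212 MINUS SIGN
        print("U+%04X %-14s %s" % (ord(ch), unicodedata.name(ch),
                                   unicodedata.category(ch)))
    # prints Pd for both hyphens, Sm for the minus sign: only the
    # minus is a distinct math symbol, the two hyphens duplicate each other.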

On the other hand, it is entirely Unicode’s merit that we have two curly apostrophes, one that doesn’t break hashtags
(U+02BC, Gc=Lm) and one that does (U+2019, Gc=Pf), as has been shared on this List (thanks to André Schappo).
But even for a language in a position to make distinct use of each of them, depending on whether the
apostrophe helps denote a particular sound or marks an elision (and despite already having a physical keyboard and
driver that would make distinct entry easy and straightforward), submitting feedback hasn’t managed to raise concern
so far. This is an example of how the industry and the governments united in the Unicode Consortium are saving end-users
lots of trouble.
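
To make the Gc distinction concrete, here is a rough sketch assuming a hashtag matcher built on the usual
word-character class \w, which is only an approximation of what real platforms do; the tag text is made up:

    import re
    import unicodedata

    print(unicodedata.category("\u02BC"), unicodedata.category("\u2019"))  # Lm Pf

    text = "#ab\u02BCcd #ab\u2019cd"    # the same made-up tag, two different apostrophes
    print(re.findall(r"#\w+", text))
    # ['#abʼcd', '#ab'] -- U+02BC, being a letter, stays inside the tag;
    #                      U+2019, being punctuation, ends it after "ab".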

Thank you.

Marcel