Re: Too narrowly defined: DIVISION SIGN & COLON

From: Leif Halvard Silli <>
Date: Mon, 9 Jul 2012 22:08:33 +0200

Jukka K. Korpela, Mon, 09 Jul 2012 15:14:56 +0300:
> 2012-07-09 11:39, Leif Halvard Silli wrote:

> In practice, it’s always a symbol of division in calculators.

It wasn't always like that. Take the Danish Contex calculators:

* Contex mechanical calculator from the 1960s,
  using 'Div' instead of a mathematical symbol.
* Contex electronic calculator, 1973, with COLON instead of DIVISION:

These days it is usually no problem. But not long ago I got
completely lost - I think it was a (web) app - probably the programmer
was unaware of the conventions:

1) Coloring the minus key in red
2) Placing minus close to plus, and division away from minus.
3) Using "/" instead of "÷", both for Norwegian calculators
   <>, and for US English (Apple) keyboards

>>> Adding new characters would be possible in principle, but hardly
>>> realistic or useful in this case. They would not change the bulk of
>>> existing data that uses existing characters, and they would just add
>>> to the confusion rather than remove it.
>> Could not 'DIVISION SLASH' have been dismissed by the same argument?
> Back in the early 1990s (DIVISION SLASH was adopted into Unicode in
> version 1.1 in 1993), it might have been possible to present such an
> argument against it. The database entry
> says that
> DIVISION SLASH is/was present in “VENTURA_SYM” encoding (“charset”),
> and presence in an existing encoding was surely a strong argument in
> favor of accepting a character. Moreover, DIVISION SLASH is not just
> SOLIDUS with more exact semantics; the characters are typically
> clearly different, DIVISION SLASH being much more slanted.

'Everson Mono' has them exactly identical, which seems typical for
monospaced fonts. (Perhaps this is 'wrong' of the font designers - to
make it look like SOLIDUS?) In 'Lucida Grande', the DIVISION SLASH is
slightly longer and higher, but with the same slant. But in 'Times New
Roman', the DIVISION SLASH is, as you say, more slanted (and longer
plus higher).
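
The glyph comparison above can be made concrete at the code-point level. A small sketch (standard-library Python, using `unicodedata`) listing the slash-like and minus-like characters under discussion - their glyphs may coincide in a given font, but the code points and intended semantics are distinct:

```python
import unicodedata

# The characters discussed in this thread. In some fonts (e.g. many
# monospaced ones) several of these render identically, but each is
# a separate code point with its own name and semantics.
for ch in ["/", "\u2215", "\u00f7", "-", "\u2212"]:
    print(f"U+{ord(ch):04X}  {ch}  {unicodedata.name(ch)}")
```

Running this prints SOLIDUS, DIVISION SLASH, DIVISION SIGN, HYPHEN-MINUS and MINUS SIGN in turn, regardless of how any font chooses to draw them.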

In contrast, the "division-minus" would not have varied with the style
of the font ... It would always have had the two dots.

> But this precedent demonstrates that narrow semantics does not make
> characters popular.

The notion of 'popularity' for characters is problematic. As I have
explained, the DIVISION SIGN ÷ is not popular amongst Norwegian
typists. Hah, I don't know how to type it - I only looked it up just
now (Alt-Z on my Mac). I think we must talk about the context
(language, typing, display, application type, etc.) when we evaluate
the popularity of a character.

> Most people and documents use “/” for division
> (and this is supported by a normative rule in ISO 80000-2), without
> ever considering the possibility of using DIVISION SLASH or even
> knowing about it at all. Yet, this character has existed in Unicode
> for almost 20 years.

The SOLIDUS "/" for division *might* be part of the "Anglo-fication of
the World". It is also one thing to press that symbol on a calculator,
and another thing to use it in text. In 'professional' math, typed with
LaTeX et cetera, one probably uses the 'real' DIVISION symbol, since
LaTeX doesn't require you to know how to type it on the keyboard.

I forgot to mention the multiplication symbol. The common
multiplication symbol in Norwegian schools looks like the MIDDLE DOT.
And I see that in LaTeX, the recommendation is to not type a
multiplication symbol at all (!), or else to use \cdot (thus: MIDDLE
DOT) or \times (aka: MULTIPLICATION SIGN).
Google includes a calculator in the search field. For multiplication,
it allows me to use the MULTIPLICATION X and the ASTERISK. But it does
not allow me to use MIDDLE DOT. Likewise, for division, it allows me to
use the SOLIDUS. And for subtraction, it allows me to use HYPHEN-MINUS.
But not MINUS SIGN. Which means that if I paste "−6" from my Apple,
then Google won't understand it.

One conclusion to draw is that in math, there is often a gap between
typing and display and between handwriting and computer.

> Semantic disambiguation just doesn’t work, as a rule. Far from being
> “the” division slash, DIVISION SLASH exists in Unicode for use when
> you wish to use it. If you ask me, it could be used for clarity, in
> situations where this matters and where you can know for sure that
> the font(s) being used contain the character.

It is quite practical to use semantic symbols when one works with the
text.

>> I'd say that the purpose should be to take
>> the consequence of a realization that it is a independent character.
> But that’s a fairly theoretical, even ideological purpose.

Even when one is only being descriptive, one is so for some purpose …

>> (But I guess, as well, that it would be legitimate, for a font
>> designer, to make a 'MINUS' which was shaped as a DIVISION MINUS?)
> No, because that would distort the identity of the character. It is
> an error to make a character intentionally look like another
> character. But it’s not a punishable crime, and font designers make
> such mistakes.

Yeah, we don't want to make Latin letters look like Lanka letters …

But in this case the idea is not to make a MINUS SIGN that looks like a
DIVISION SIGN. The idea is, instead, to distinguish the MINUS SIGN from
a HYPHEN-MINUS, by making it look like the minus sign that you and I
know from our own cultures. Because that is what it is: a minus sign.
E.g. when I see ÷50%, I don't consider that I am looking at a division
sign. I see something that semantically means 'minus'.

So what is wrong in making the MINUS SIGN look like the - to be
codified - DOUBLE DOTTED MINUS SIGN? At any rate, until it eventually
gets specified that DIVISION SIGN can also mean minus, it seems just as
wrong to type DIVISION SIGN in order to get the - to be codified -
DOUBLE DOTTED MINUS SIGN.

I think that the mathematical signs of Unicode are meant to be
unambiguous - it would be bad, for instance, if the MULTIPLICATION X
could also be taken to mean plus (+), no?

>> But before landing on that conclusion, I would like to point out that
>> if one added new characters, then one would get annotation, _as well_
> It would create separate entries, but this does not imply any
> annotations by default. The annotations are there because decisions
> were made to include them.

OK. But in this case, I think one should decide to add them ...

Leif Halvard Silli
Received on Mon Jul 09 2012 - 15:12:09 CDT

This archive was generated by hypermail 2.2.0 : Mon Jul 09 2012 - 15:12:33 CDT