From: Ruszlan Gaszanov (email@example.com)
Date: Sun Jan 21 2007 - 14:25:08 CST
> When I implemented collation, I needed to define code points for
> the various contractions that can occur. To avoid clashing with
> any private use code points, I chose to start allocating the
> contractions at 0x110000. This has worked quite nicely.
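The scheme described in the quote can be sketched roughly like this (the class and the sample contractions are hypothetical illustrations, not the poster's actual code): internal sentinel values for contractions are handed out starting at 0x110000, one past the last Unicode code point U+10FFFF, so they can never collide with PUA characters or any other assigned character.

```python
# Sketch of allocating application-internal "code points" above the
# Unicode range, as described in the quoted message. Names are invented
# for illustration.

UNICODE_MAX = 0x10FFFF  # last valid Unicode code point

class ContractionAllocator:
    def __init__(self):
        self.next_code = UNICODE_MAX + 1  # first value outside Unicode: 0x110000
        self.table = {}                   # contraction string -> internal code

    def allocate(self, contraction: str) -> int:
        # Assign each distinct contraction a stable internal value.
        if contraction not in self.table:
            self.table[contraction] = self.next_code
            self.next_code += 1
        return self.table[contraction]

alloc = ContractionAllocator()
alloc.allocate("ch")   # -> 0x110000
alloc.allocate("ll")   # -> 0x110001
alloc.allocate("ch")   # same contraction reuses 0x110000
```

Because these values lie outside the Unicode code space entirely, no PUA-encoded private character can ever clash with them.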
Obviously, there are ways to avoid using PUA code points for application-internal purposes if you make a point of it. But that's not my point...
The problem is that there is *nothing* in the current standard that says applications can't use any PUA code point as they please, so many applications use them for different things. Regardless of what you or I think of it, such use of the PUA is already well-established practice among major software vendors (Microsoft, Apple, Adobe, etc.). Some info on such uses can be found here:
So why can't the standard say to the developers:
"You want to use the PUA for application-internal purposes, fine... but use specially designated ranges and don't mess with the rest." Then those who encode private characters in the PUA would, on one hand, be officially advised to avoid those "dirty areas" lest they risk compatibility issues, and would, on the other hand, be guaranteed that no future standard-compliant application will mess with their PUA characters in the "clean areas".
Of course, the standard could prohibit using the PUA for application-internal purposes altogether, but that would oblige major software vendors to rewrite a good deal of their code - so there's a good chance they'd simply ignore such a requirement. I therefore think it would be much more sensible to put a sort of "warning sign" on the ranges where such use is already well established, and require developers to keep the rest of the PUA "clean" if they want to claim standard compliance.
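The "clean"/"dirty" partition proposed above might look something like this in code. The PUA block boundaries below are the real ones from the Unicode standard (U+E000..U+F8FF, U+F0000..U+FFFFD, U+100000..U+10FFFD), but the "dirty" sub-range reserved for application-internal use is purely a made-up example - the standard defines no such split today.

```python
# Hypothetical illustration of partitioning the PUA into areas reserved
# for application-internal use ("dirty") and areas guaranteed safe for
# private character encoding ("clean"). The DIRTY_PUA boundaries are
# invented for this sketch.

PUA_BLOCKS = [
    (0xE000, 0xF8FF),      # BMP Private Use Area
    (0xF0000, 0xFFFFD),    # Supplementary Private Use Area-A
    (0x100000, 0x10FFFD),  # Supplementary Private Use Area-B
]
DIRTY_PUA = [(0xE000, 0xE0FF)]  # hypothetical app-internal reservation

def in_ranges(cp, ranges):
    return any(lo <= cp <= hi for lo, hi in ranges)

def pua_status(cp):
    """Classify a code point under the hypothetical partition."""
    if not in_ranges(cp, PUA_BLOCKS):
        return "not PUA"
    return "dirty" if in_ranges(cp, DIRTY_PUA) else "clean"

pua_status(0xE050)   # -> "dirty"  (inside the hypothetical reservation)
pua_status(0xF000)   # -> "clean"  (PUA, but outside the reservation)
pua_status(0x0041)   # -> "not PUA"
```

An application claiming compliance would then confine its internal uses to the "dirty" ranges, and users encoding private characters would stick to the "clean" ones.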
This archive was generated by hypermail 2.1.5 : Sun Jan 21 2007 - 14:28:52 CST