Asmus et al.,
I've seen this topic discussed over and over again, even as I worked on a
remarkably multilingual product which very few folks bought (Xerox's
The bottom line was then, and is now: how much are folks willing to pay for
multilingual capability, and how many folks are willing to pay it? Software
companies are for-profit organizations, and multilingual support is not trivial.
It costs a tremendous amount of money to garner the expertise, evaluate the
product(s), and design and code for multilingual support.
I'm not saying we're not moving in that direction - we certainly are here at Sun
and the Alliance, and I know other companies are. The question is, how fast?
Unicode is a big help in the area of multilingual support, but it is only a
piece of it.
My observation is that there are not enough folks willing to pay the price
_just_ for multilingual to warrant an all-out effort towards implementing it
faster. I think many people would like the capability, and would use it if they
had it. But when you tell them what it would cost, and what other new features
they would have to give up to get it right away, they would gladly stick with
what they have.
I still believe that the vast majority of multilingual needs can be met within
the confines of existing charsets and data processing. Of all the int'l
feedback (read: complaints) I have heard about the products I have worked on,
multilingual just isn't one of them.
We only have so much time/money/resources to work with. Multilingual support is
a good goal; let's keep working towards it. But first there are some more pressing
i18n issues to be resolved.
Sun-Netscape Alliance i18n architect
Asmus Freytag wrote:
> 1) My utility bills in Seattle are printed in these languages:
> - English, Spanish, Vietnamese, Chinese, Korean, Lao and Thai.
> (on the same bill).
> 2) Some of the foods (and other goods) I buy come with packages that contain
> - English, European languages, even Arabic
> (These are not necessarily 'ethnic' foods, but the packages are
> intended for export. Extreme combinations of languages are more
> common in some markets, those that get the 'Rest of World' package
> as seen from the perspective of the producer)
> 3) Instruction and safety booklets come with almost any juxtaposition of
>    languages.
> 4) Most of my (European) newspapers easily cross alphabet boundaries
> (e.g. use of correct Latin-2 accents is common in Latin-1 languages).
> Somebody has to produce all of these. For that purpose it would be enough to
> have the translators use special purpose software. But just as with the
> problems of German e-mail in a mixed 6/7/8 bit e-mail infrastructure, this
> scenario runs into problems as one finds oneself reduced to manipulating
> pictures of translations in the rest of the production process. And that is
> how many of these things are done.
> I appreciate Chris' attempt to not overstate the case for multilingual
> support, but as we become more dependent on the web and net infrastructure
> to handle all our text processing, the remaining bottlenecks do tend to
> have a 'reverse synergy' type cost to them. My theory is that these
> bottlenecks prevent some users from fully adopting the new technologies.
> This 'cost' probably scales more with the percentage of users who have to
> worry about identifying and managing workarounds for them, even if
> infrequently, rather than merely with the straight percentage of documents.
This archive was generated by hypermail 2.1.2 : Tue Jul 10 2001 - 17:20:56 EDT