Michael Everson writes (and kenw responds):
> >Complex for WHAT? What are you trying to do?
> Sorting and double-click word selection, for instance. Global recognition
> for find-and-replace.
> >Iteration is simply
> >iteration... Computers do it better than we do, so what if you have to keep
> >iterating on something? Infinite lookahead is not "difficult" and it doesn't
> >make anything more complex; it just takes the machine more iterations...
> No loss of processing efficiency?
If what you are doing is double-click word selection, then the extra
iterations are completely irrelevant. Compared to the display operations
that do the actual highlighting, the cost is hardly measurable and has no
impact on "perceived performance", which is all that matters.
For collation and text matching (as required for find and/or replace),
lookahead processing is required, whether one uses combining marks or
not. Perceived performance is a result of just how clever the algorithms
are, and whether they depend on cached data or always recalculate anew.
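To make the lookahead point concrete, here is a minimal sketch (mine, not from the original post; `match_at` is a hypothetical helper) of why matching text that contains combining marks needs at least one character of lookahead: a candidate match must not end in the middle of a combining sequence.

```python
import unicodedata

def match_at(text, i, pattern):
    """Match pattern at position i, refusing to split a combining sequence.

    The character just past the matched span must not be a combining mark,
    which requires one character of lookahead beyond the pattern itself.
    """
    if not text.startswith(pattern, i):
        return False
    j = i + len(pattern)
    return j >= len(text) or unicodedata.combining(text[j]) == 0

# "étude" with a decomposed é: 'e' followed by U+0301 COMBINING ACUTE ACCENT.
text = "e\u0301tude"
assert not match_at(text, 0, "e")  # lookahead sees the combining acute; no match
assert match_at(text, 2, "t")      # plain letter follows; match is valid
```

The same lookahead shows up in collation, where a base letter plus following marks may sort as a single unit.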
And if one is using combining marks, and IF one does not normalize
the text before doing the text operation which requires the lookahead,
then performance could vary unpredictably with the number of
precomposed-to-decomposed match comparisons that have to be made. But it
would be a sorry engineer who just sat there and moaned about that.
Text normalization on input is just one item in the bag of
tricks that can be brought to bear.
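A minimal illustration of that trick in Python (my sketch, not part of the original exchange): normalizing both sides on input means the matcher never has to compare precomposed against decomposed forms at search time.

```python
import unicodedata

# "é" can arrive precomposed (U+00E9) or decomposed (U+0065 + U+0301).
precomposed = "caf\u00e9"
decomposed = "cafe\u0301"

# Raw code-point comparison misses the match:
assert precomposed != decomposed

# Normalizing on input (here to NFC) makes find-and-replace see them as equal:
nfc = lambda s: unicodedata.normalize("NFC", s)
assert nfc(precomposed) == nfc(decomposed)
```

NFD (normalizing toward decomposed forms) works equally well; what matters is that both the stored text and the search pattern go through the same normalization before comparison.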
This archive was generated by hypermail 2.1.2 : Tue Jul 10 2001 - 17:20:31 EDT