From: John Hudson (tiro@tiro.com)
Date: Thu Mar 10 2005 - 12:39:19 CST
Dean Snyder wrote:
> I've made it very clear that THE basis for my thinking on encoding damage
> indicators is to enable "guaranteed" integrity for damaged, interchanged
> plain text.
It seems to me that this is a specialised usage for text -- one in which, obviously, I
have some interest -- and however one indicates damage, ambiguity or omission in
electronic transcription of manuscripts one will likely be working with such text in
specialised software or customisations (macros, scripts, etc.). I've looked at scholarly
printed works, from the past 200+ years, that deal with such texts and have seen a variety
of conventions to indicate that text is dama[g]ed, ambig{u/v}os, (.. .)issing. Conventions
may be particular to scholarly printing in a particular country, a particular publishing
house, or even a particular author. It seems to me that standardisation of the way in
which such artefacts of text are indicated in print would be a good first step toward
determining how best to encode them, because if one had a discrete set of standardised
sigils (whether drawn from existing characters or new), one could design one's tools to
ignore or interpret these sigils as desired for a particular function (e.g. ignore them
when performing text comparisons, word searches, etc.; display them when printing; and so on).
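To make the idea concrete, here is a minimal sketch of such a tool, assuming a purely hypothetical sigil set (no standardised set exists yet): square brackets marking restored letters, braces marking ambiguous readings, and a parenthesised run of dots marking a lacuna. A comparison or search function would strip the sigils first, while a display function would leave them in place.

```python
import re

# Hypothetical sigil conventions, assumed for illustration only:
#   [g]      -- restored (damaged) letter
#   {u/v}    -- ambiguous reading
#   (.. .)   -- lacuna / missing text
SIGIL_PATTERN = re.compile(r"[\[\]{}]|\(\.+ ?\.\)")

def strip_sigils(text: str) -> str:
    """Remove damage/ambiguity sigils so searches and comparisons
    operate on the underlying plain text; display code would skip
    this step and render the sigils as printed."""
    return SIGIL_PATTERN.sub("", text)
```

For example, `strip_sigils("dama[g]ed")` yields `"damaged"`, so a word search for "damaged" would match the transcribed form even though the printed edition marks the restored letter.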
John Hudson
--
Tiro Typeworks        www.tiro.com
Vancouver, BC         tiro@tiro.com

Currently reading:
A century of philosophy, by Hans Georg Gadamer
David Jones: artist and poet, ed. Paul Hills
This archive was generated by hypermail 2.1.5 : Thu Mar 10 2005 - 12:40:32 CST