Re: Why is "endianness" relevant when storing data on disks but not when in memory?

From: Bjoern Hoehrmann <derhoermi_at_gmx.net>
Date: Sun, 06 Jan 2013 00:41:20 +0100

* Costello, Roger L. wrote:
>On page 62 it says:
>
> ... when we store ... data on disk, we write
> not 32-bit (or 16-bit) numbers but series of
> four (or two) bytes. And according to the
> type of processor (Intel or RISC), the most
> significant byte will be written either first
> (the "little-endian" system) or last (the
> "big-endian" system). Therefore we have
> both a UTF-32BE and a UTF-32LE, a UTF-16BE
> and a UTF-16LE.
>
>Then, on page 63 it says:
>
> ... UTF-16 or UTF-32 ... if we specify one of
> these, either we are in memory, in which case
> the issue of representation as a sequence of
> bytes does not arise, or we are using a method
> that enables us to detect the endianness of the
> document.
>
>When data is in memory, isn't it important to know
>whether the most significant byte is first or last?
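
For concreteness, here is a minimal C sketch (mine, not the book's) of
the two serializations the book describes, for a single 32-bit code
unit. Note that the shifts work the same on any host, precisely
because they never expose the host's own byte order:

  #include <stdio.h>
  #include <stdint.h>

  /* Write one 32-bit code unit as four bytes: most significant
     byte first (UTF-32BE), least significant byte first (UTF-32LE). */
  int main(void)
  {
      uint32_t cp = 0x0001D11E;  /* U+1D11E MUSICAL SYMBOL G CLEF */

      unsigned char be[4] = {
          (cp >> 24) & 0xFF, (cp >> 16) & 0xFF,
          (cp >>  8) & 0xFF,  cp        & 0xFF
      };
      unsigned char le[4] = { be[3], be[2], be[1], be[0] };

      printf("UTF-32BE: %02X %02X %02X %02X\n", be[0], be[1], be[2], be[3]);
      printf("UTF-32LE: %02X %02X %02X %02X\n", le[0], le[1], le[2], le[3]);
      return 0;
  }

This prints "UTF-32BE: 00 01 D1 1E" and "UTF-32LE: 1E D1 01 00".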

The idea is that this knowledge is implicit because only a single
system with a single byte-order convention is involved, and it is
assumed that you do not look behind the curtain (do not access the
"first" byte of a multi-byte integer, for instance).
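
In other words, portable code treats the value as an opaque uint32_t
and never notices the convention; only something like the following
sketch (again mine, and deliberately peeking where portable code
should not) reveals it:

  #include <stdio.h>
  #include <stdint.h>
  #include <string.h>

  int main(void)
  {
      uint32_t value = 0x01020304;
      unsigned char first;

      /* Copying out the lowest-addressed byte is the "looking
         behind the curtain" step; nothing else here depends on
         the host's byte order. */
      memcpy(&first, &value, 1);

      if (first == 0x04)
          puts("little-endian host: least significant byte first");
      else if (first == 0x01)
          puts("big-endian host: most significant byte first");
      return 0;
  }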

-- 
Björn Höhrmann · mailto:bjoern@hoehrmann.de · http://bjoern.hoehrmann.de
Am Badedeich 7 · Telefon: +49(0)160/4415681 · http://www.bjoernsworld.de
25899 Dagebüll · PGP Pub. KeyID: 0xA4357E78 · http://www.websitedev.de/ 
