Comment by Filip Milovanović on When and how endianness naming was defined?

Look at an ordinary decimal number, like 1234. In terms of what contributes to the value, '1' is the most significant digit (it represents thousands, so changing it affects the value the most); it's the "big end" of the number. Similarly, '4' is the least significant digit (it represents ones, so changing it affects the value the least); it's the "little end". You can transfer the same concept to a sequence of bytes: in decimal numbers, each digit can be one of 10 values, while in byte sequences, each "digit" can be one of 256 values.
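
As a rough sketch of the idea in Python (the value 0x0A0B0C0D is just an arbitrary example, and `int.to_bytes` is used to show the two byte orders):

```python
value = 0x0A0B0C0D  # 4-byte integer; 0x0A is the most significant ("big end") byte

big    = value.to_bytes(4, byteorder="big")     # most significant byte first
little = value.to_bytes(4, byteorder="little")  # least significant byte first

print(big.hex())     # 0a0b0c0d  -> the "big end" comes first (big-endian)
print(little.hex())  # 0d0c0b0a  -> the "little end" comes first (little-endian)
```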
