Look at an ordinary decimal number, like 1234. The '1' is the most significant digit: it represents thousands, so changing it affects the value the most. It's the "big end" of the number. Similarly, the '4' is the least significant digit: it represents ones, so changing it affects the value the least. It's the "little end". You can transfer the same concept to a sequence of bytes. In decimal numbers, each digit can be one of 10 values (0 through 9); in byte sequences, each "digit" (a byte) can be one of 256 values (0 through 255).
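You can see the "big end" and "little end" directly by packing the same integer both ways. A small Python sketch (the value 0x12345678 is just an example chosen so each byte is recognizable):

```python
import struct

value = 0x12345678  # a 32-bit value with four distinct bytes

# ">I" packs a 32-bit unsigned int big-endian: most significant byte first.
big = struct.pack(">I", value)
# "<I" packs it little-endian: least significant byte first.
little = struct.pack("<I", value)

print(big.hex())     # 12345678  (the "big end", 0x12, comes first)
print(little.hex())  # 78563412  (the "little end", 0x78, comes first)
```

The numeric value is identical in both cases; only the order in which the bytes are laid out in memory differs.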