I'm working through The Elements of Computing Systems when I read the following excerpt:
The Hack computer includes a black-and-white screen organized as 256 rows of 512 pixels per row. The screen's contents are represented by an 8K memory map that starts at RAM address 16384 (0x4000). Each row in the physical screen, starting at the screen's top left corner, is represented in RAM by 32 consecutive 16-bit words. Thus the pixel at row r from the top and column c from the left is mapped on the c%16 bit (counting from LSB to MSB) of the word located at RAM[16384 + r * 32 + c/16]. To write or read a pixel of the physical screen, one reads or writes the corresponding bit in the RAM-resident memory map (1 = black, 0 = white).
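For context, the mapping the excerpt describes could be sketched like this (a hypothetical helper, not from the book; RAM is modeled as a plain list of 16-bit words — the word offset within a row is c divided by 16, and the bit within that word is c mod 16):

```python
SCREEN_BASE = 16384  # 0x4000, start of the screen memory map

def set_pixel(ram, r, c, black=True):
    """Set the pixel at row r, column c in the RAM-resident memory map."""
    word_addr = SCREEN_BASE + r * 32 + c // 16  # 32 words per 512-pixel row
    bit = c % 16                                # bit index, counting from the LSB
    if black:
        ram[word_addr] |= 1 << bit              # 1 = black
    else:
        ram[word_addr] &= ~(1 << bit)           # 0 = white
```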
So, if the screen is 256 rows of 512 pixels, and each pixel is a single bit, how is that an 8K memory map for the whole screen?
256 rows * 512 pixels = 131,072 bits; 131,072 / 8 bits per byte = 16,384 bytes; 16,384 / 1,024 bytes per K = 16K
Wouldn't that be a 16K memory map?
The only thing I can think of is that the 16-bit word size plays a factor here. I have always known "byte" to mean 8 bits, but if its definition depends on the computer's word size, maybe that would solve this mystery for me. Can someone explain how the screen described in that paragraph is represented with an 8K memory map and not 16K?
Yes, a byte is always 8 bits in modern computing.
The book uses words, not bytes
In the book, the word and its size are mentioned explicitly, while there is not a word (haha) about bytes. Look at the phrase "...is represented in RAM by 32 consecutive 16-bit words." The whole size is expressed in (16-bit) words rather than bytes.
Therefore, 8K refers to 8 kilowords. 8 kilobytes would formally be written as 8KB, if that notation is used at all in this book.
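Worked out in both units, the same bit count gives both numbers (a quick sketch of the arithmetic):

```python
rows, cols = 256, 512
bits = rows * cols    # 131072 pixels, one bit each

words = bits // 16    # 8192 sixteen-bit words
print(words // 1024)  # 8  -> an 8K memory map, measured in words

nbytes = bits // 8    # 16384 bytes
print(nbytes // 1024) # 16 -> the 16K figure, measured in bytes
```

So both calculations are right; they just measure the same 131,072 bits in different units.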
Words are quite important when it comes to processor architecture. In programming languages a word is usually 2 bytes (16 bits), but in processor architectures it can be 8 or 32 bits as well; the word is the natural size of the data unit a processor works with. So it makes sense that the book uses words rather than bytes, since the text is very hardware oriented.
To see how bytes and words are related, please read this answer.
Different byte sizes
Wikipedia describes how a byte was originally (1960s) sized for the information it needed to hold, so a 6-bit byte could be used for (English) characters, while bytes of other sizes would be used for different number formats. It started out as 6 bits for English characters, grew to 7 bits to support ASCII, and eventually the popularity of the IBM System/360, with its 8-bit bytes, led to the global acceptance of the 8-bit byte.
A byte is a software unit representing an amount of data, while a word is more tied to the processor/memory architecture: it is the size of the work units (registers) in the processor and of the addressable units in memory. Nowadays, though, a byte is always considered to be 8 bits, and words are a multiple of that. There are still processors around with a different word size, but those are special-purpose processors; normal hardware, from PCs to phones and game consoles, follows the standard.