Fundamental idea: the most basic unit of data is the
byte -- virtually all computers (and network systems)
handle data one byte at a time. Recall that a byte is an 8-bit
value and thus can take any value between zero and
255 decimal.
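To make that range concrete, here is a minimal sketch in Python (the names are illustrative only, not taken from any particular system) showing that 8 bits give exactly 256 distinct values, 0 through 255:

    # An 8-bit byte can hold 2**8 = 256 distinct values.
    NUM_BITS = 8
    num_values = 2 ** NUM_BITS
    print(num_values)              # 256
    print(num_values - 1)          # 255, the highest value a single byte can hold

    # Python's bytes type enforces the same range:
    data = bytes([0, 127, 255])    # accepted
    # bytes([256]) would raise ValueError, because 256 does not fit in one byte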
US-ASCII (or just "ASCII") was the first widely accepted data
representation system, and it is universally recognised. In its
traditional form, it's a 7-bit code, meaning that if an ASCII
message is stored or carried in a modern byte-oriented system, the
Most Significant Bit (MSB) of every byte will
always be zero. For this reason, ASCII messages are sometimes
called "7-bit data". An ASCII-valued byte has traditionally been
called a "character", and obviously takes any value between zero
and 127.
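As a rough sketch of that property (Python; the helper name is made up for illustration), testing whether a run of bytes could be 7-bit ASCII amounts to checking that the MSB of every byte is zero:

    # Illustrative helper: True if every byte has bit 7 (the MSB, mask 0x80) clear,
    # i.e. every value lies in the 7-bit ASCII range 0-127.
    def looks_like_7bit_ascii(data: bytes) -> bool:
        return all((b & 0x80) == 0 for b in data)

    print(looks_like_7bit_ascii(b"Hello, world!"))     # True  -- plain ASCII text
    print(looks_like_7bit_ascii(bytes([0x48, 0xC3])))  # False -- 0xC3 has its MSB set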
Within the ASCII "character set" there is a further
subdivision:
Printable ASCII
characters with values from 32 (the ASCII "space" character) up to
126 (the tilde, "~"). This includes all of the uppercase and
lowercase letters, the digits and the punctuation characters.
Control Characters
character values between zero and 31, plus 127 (the "DEL"
character). These were originally designed for a range of control
functions (device control, transmission control and so on), most of
which are now irrelevant.
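That subdivision can be expressed as a small classification sketch (Python; the function name is made up for illustration), assuming the boundaries described above:

    # Illustrative classifier for a single byte value: 0-31 and 127 are control
    # characters, 32-126 are printable, and anything above 127 is outside ASCII.
    def classify_ascii(value: int) -> str:
        if not 0 <= value <= 255:
            raise ValueError("a byte value must be between 0 and 255")
        if value > 127:
            return "not ASCII (MSB set)"
        if value < 32 or value == 127:
            return "control character"
        return "printable character"

    print(classify_ascii(9))      # control character    (TAB)
    print(classify_ascii(65))     # printable character  ('A')
    print(classify_ascii(127))    # control character    (DEL)
    print(classify_ascii(200))    # not ASCII (MSB set)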