CS146 - 01-22-2026

More on ASCII etc...

Recall: Characters - each character needs a numeric code that represents it.

  • 7 bits are used so that the remaining 1 bit of the byte can be used for other things, like error detection (a parity bit)

    • The first bit is usually 0, allowing 2^7 possibilities

    • What if the first bit was 1?

      • Thus, new symbols were added (extended ASCII), giving 128 extra codes

        • But this STILL isn't enough to represent every kind of character!

        • Solution: A new encoding, called Unicode

          • About 1.1 million possibilities (code points U+0000 through U+10FFFF, i.e. 1,114,112 in total).

ASCII Code:

// Some useful ASCII values

'\n' = 10
' '  = 32
'0' - '9' = 48 - 57
'A' - 'Z' = 65 - 90
'a' - 'z' = 97 - 122
 

EBCDIC: Extended Binary Coded Decimal Interchange Code

  • Had gaps (e.g. the letter codes are not contiguous), which makes character arithmetic awkward, so it was not preferred.

UTF-8: a variable-length encoding of Unicode that is backward compatible with ASCII — ASCII characters keep their single-byte codes, while other characters take 2 to 4 bytes.


Convert a digit char c to its numeric value: c - '0'
