Explain the Alphanumeric Coding - BCD to Decimal Conversion?

For the inherently binary world of computers, it is necessary to put all symbols, letters, numbers, and so on into binary form. The most commonly used alphanumeric codes are the Extended Binary Coded Decimal Interchange Code (EBCDIC) and the American Standard Code for Information Interchange (ASCII). In each of these codes, the digits 0-9, the entire character set, and the control characters are represented by 8-bit codes, that is, a single byte.

In EBCDIC, all eight bits are used in the character representation:

  • Digits 0-9 are represented by their 4-bit binary code preceded by the zone bits 1111; thus 5 is represented by 11110101 (see the sketch after this list)
  • Characters are represented by separate eight-bit codes, e.g. A is 11000001

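To make the zone-plus-BCD structure concrete, here is a minimal Python sketch, written for this explanation rather than taken from any standard library, that encodes a decimal digit as an EBCDIC byte and recovers it again. Stripping the 1111 zone is essentially a one-digit BCD-to-decimal step:

    # Illustrative sketch of the EBCDIC digit encoding described above:
    # the zone bits 1111 followed by the 4-bit BCD value of the digit.

    def ebcdic_digit(d):
        """Encode a decimal digit 0-9 as an 8-bit EBCDIC code."""
        if not 0 <= d <= 9:
            raise ValueError("digit must be 0-9")
        return 0b1111_0000 | d          # zone 1111, then the BCD nibble

    def ebcdic_digit_to_decimal(byte):
        """Recover the decimal digit by discarding the 1111 zone."""
        return byte & 0b0000_1111

    print(f"{ebcdic_digit(5):08b}")             # 11110101, as in the text
    print(ebcdic_digit_to_decimal(0b11110101))  # 5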

In ASCII, only 7 of the bits are used in the character representation; the leftmost bit is reserved for a parity bit used in error detection:

  • Digits are represented by their 4-bit binary code preceded by 011; thus 3 is represented by 0110011
  • Characters again use all 7 bits, e.g. a is 1100001 (see the sketch after this list)

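The ASCII side can be sketched the same way, including the parity bit mentioned above. Even parity is assumed here purely for illustration, since the text does not say which parity convention applies, and the helper names are hypothetical:

    # Illustrative sketch of the 7-bit ASCII digit encoding (011 + BCD value)
    # with an even-parity bit added in the leftmost, eighth position.
    # Even parity is an assumption made for this example only.

    def ascii_digit(d):
        """Encode a decimal digit 0-9 as a 7-bit ASCII code."""
        if not 0 <= d <= 9:
            raise ValueError("digit must be 0-9")
        return 0b011_0000 | d           # 011, then the BCD nibble

    def with_even_parity(code7):
        """Prefix a 7-bit code with a parity bit so the 1-count is even."""
        parity = bin(code7).count("1") & 1   # 1 only if the 1-count is odd
        return (parity << 7) | code7

    print(f"{ascii_digit(3):07b}")                    # 0110011, as in the text
    print(f"{with_even_parity(ascii_digit(3)):08b}")  # 00110011 (parity bit 0)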