How many bits are used to show Unicode and ASCII?


How many bits are used to show Unicode, ASCII, UTF-16, and UTF-8 characters?

Unicode code points range up to U+10FFFF, so up to 21 bits are needed to identify a character (the original Unicode design used 16 bits). ASCII requires only 7 bits, although ASCII characters are usually stored in 8 bits (one byte).

UTF-8 encodes characters using 8-, 16-, 24-, or 32-bit patterns (1 to 4 bytes per character).

UTF-16 uses 16-bit units; characters outside the Basic Multilingual Plane are encoded as surrogate pairs, which take 32 bits.
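The variable-width behavior described above can be observed directly in Java, whose `String.getBytes` method reports how many bytes a given encoding actually uses. This is a minimal sketch; the sample characters are chosen for illustration (UTF-16BE is used to avoid counting a byte-order mark):

```java
import java.nio.charset.StandardCharsets;

public class EncodingSizes {
    public static void main(String[] args) {
        // 'A'  (U+0041): ASCII, 1 byte in UTF-8, 2 bytes in UTF-16
        // 'é'  (U+00E9): 2 bytes in UTF-8, 2 bytes in UTF-16
        // '€'  (U+20AC): 3 bytes in UTF-8, 2 bytes in UTF-16
        // '😀' (U+1F600): 4 bytes in UTF-8, 4 bytes in UTF-16 (surrogate pair)
        String[] samples = {
            "A", "\u00E9", "\u20AC", new String(Character.toChars(0x1F600))
        };
        for (String s : samples) {
            System.out.printf("%s -> UTF-8: %d bytes, UTF-16: %d bytes%n",
                    s,
                    s.getBytes(StandardCharsets.UTF_8).length,
                    s.getBytes(StandardCharsets.UTF_16BE).length);
        }
    }
}
```

Running this shows that UTF-8 grows from 1 to 4 bytes as code points get larger, while UTF-16 uses 2 bytes for everything in the Basic Multilingual Plane and 4 bytes beyond it.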
