
Characters are first assigned an internationally agreed decimal value, and the computer converts that decimal value to binary. For example, the decimal value for the letter A is 65, which converts to binary as 1000001.
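As a quick illustration (a minimal Python sketch, not part of the original answer; the sample characters are arbitrary), the character-to-decimal-to-binary mapping can be printed directly:

```python
# Look up the agreed (ASCII/Unicode) decimal value of each character,
# then show the same value in binary.
for ch in ["A", "B", "a"]:
    decimal = ord(ch)              # e.g. 'A' -> 65
    binary = format(decimal, "b")  # e.g. 65 -> '1000001'
    print(ch, decimal, binary)
```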


Related Questions

What are bytes used to represent?

Bytes are used to represent the amount of data a memory holds, i.e. its storage capacity.


What is a Byte?

A byte is the combination of bits used to represent a particular letter, number, or other character.


How many bytes are used to represent one character?

UTF-8, the most commonly used standard for representing text, uses a varying number of bytes per character. The Latin alphabet and digits, as well as commonly used characters such as (but not limited to) <, >, -, /, \, $, !, %, @, &, ^, (, ), and *, are represented with a single byte. Characters beyond those, such as accented letters and other language scripts, are usually represented with two or three bytes, and the most a character can use is four bytes.
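For illustration (a short Python sketch added here, not part of the original answer; the sample characters are arbitrary), the variable lengths, including the four-byte maximum, can be checked directly:

```python
# Number of bytes each character occupies when encoded as UTF-8.
samples = ["A", "$", "é", "€", "你", "😀"]
for ch in samples:
    encoded = ch.encode("utf-8")
    print(ch, len(encoded), "byte(s)")   # 1, 1, 2, 3, 3, 4
```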


Does Unicode use two bytes for each character rather than one byte?

Unicode can use varying byte lengths to represent characters, depending on the encoding system employed. For example, UTF-8 uses one to four bytes per character, while UTF-16 typically uses two bytes for most common characters but can use four bytes for less common ones. Therefore, it is not accurate to say that Unicode universally uses two bytes for each character; it depends on the specific encoding used.
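A small Python sketch (added for illustration, not part of the original answer) makes the UTF-8 versus UTF-16 difference concrete; the "-le" suffix just selects little-endian UTF-16 without a byte-order mark:

```python
# Compare how many bytes the same characters need in UTF-8 and UTF-16.
for ch in ["A", "é", "你", "😀"]:
    utf8 = ch.encode("utf-8")
    utf16 = ch.encode("utf-16-le")
    print(ch, "UTF-8:", len(utf8), "bytes", "UTF-16:", len(utf16), "bytes")
    # UTF-8 gives 1, 2, 3, 4 bytes; UTF-16 gives 2, 2, 2, 4 bytes.
```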


Four bytes can represent a decimal number between 0 and?

Four bytes contain 32 bits, and 32 bits give 2^32 = 4,294,967,296 possible values, so an unsigned four-byte number can represent decimal values from 0 to 4,294,967,295.
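A one-line Python check of that arithmetic (added for illustration):

```python
# 32 bits give 2**32 distinct values; the largest unsigned value is 2**32 - 1.
values = 2 ** 32
print(values)      # 4294967296 possibilities
print(values - 1)  # 4294967295 -> largest unsigned 32-bit number
```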


What does the term byte mean?

Bytes are units of digital information that are commonly used to measure the size of files or the amount of data stored in a computer. One byte is equal to 8 bits, and it is often used to represent a single character of text. Bytes are a fundamental building block for storing and processing data in computers.


Characters use only one byte, so why does Java use two bytes for a character?

The number of bytes used by a character varies from language to language. Java uses a 16-bit (two-byte) character so that it can represent many non-Latin characters in the Unicode character set.
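As an illustration (a Python sketch added here, not from the original answer): Java's char is a 16-bit UTF-16 code unit, so any character in Unicode's Basic Multilingual Plane fits in exactly two bytes, which the equivalent UTF-16 encoding shows:

```python
# Characters in the Basic Multilingual Plane take exactly two bytes in UTF-16,
# the same size as a Java char (a 16-bit code unit).
for ch in ["A", "Ω", "你"]:
    print(ch, len(ch.encode("utf-16-le")), "bytes as a UTF-16 code unit")  # all 2
```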


How many bytes for unsigned character?

One.


1 word is equal to how many bytes?

In an encoding where every character consumes 2 bytes (such as UTF-16), a word of 4 characters will consume 8 bytes.
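A quick Python check of that arithmetic (added for illustration; the sample word is arbitrary):

```python
# A four-character word at two bytes per character takes 4 * 2 = 8 bytes.
word = "byte"
print(len(word.encode("utf-16-le")))  # 8
```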


What is the storage space required for a character?

1 byte, assuming a single-byte encoding such as ASCII.


What does Mega Byte mean in the computer?

A megabyte is one million bytes, each byte being a sequence of 8 bits, which is enough information to represent one character of alphanumeric data.


How do computers represent data needed to solve problems?

Primarily bytes.