Q: What is the binary equivalent of the word Network using ASCII encoding?

Best Answer

ASCII character array (including null-terminator):

{'N','e','t','w','o','r','k','\0'}

ASCII character codes (decimal):

{78,101,116,119,111,114,107,0}

ASCII character codes (octal):

{116,145,164,167,157,162,153,0}

ASCII character codes (hexadecimal):

{4E,65,74,77,6F,72,6B,00}

ASCII character codes (binary):

{01001110,01100101,01110100,01110111,01101111,01110010,01101011,00000000}

When the eight bytes are treated as a single big-endian 64-bit value, the ASCII-encoded word "Network" has the decimal value 5,649,049,363,925,854,976.
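
If you want to reproduce these tables yourself, a small C sketch along the following lines (variable names are only illustrative) prints each character's code in every base and packs the eight bytes into the 64-bit value quoted above:

#include <stdio.h>
#include <stdint.h>

int main(void) {
    const char word[] = {'N','e','t','w','o','r','k','\0'};
    uint64_t packed = 0;

    for (size_t i = 0; i < sizeof word; i++) {
        unsigned char c = (unsigned char)word[i];
        printf("%c  dec=%3u  oct=%03o  hex=%02X  bin=", c ? c : ' ', c, c, c);
        for (int bit = 7; bit >= 0; bit--)
            putchar(((c >> bit) & 1) ? '1' : '0');
        putchar('\n');
        packed = (packed << 8) | c;   /* big-endian packing: 'N' becomes the most significant byte */
    }
    printf("64-bit value: %llu\n", (unsigned long long)packed);   /* 5649049363925854976 */
    return 0;
}

The output matches the tables above byte for byte.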

Continue Learning about Engineering

How do you say Hi in binary code?

That depends on your string encoding. In ASCII, for example: H = 72 = 1001000, i = 105 = 1101001.


What is the difference between binary code and extendible binary code?

There is no such thing as "extendible" binary code. However, there are two similarly named things: eXtendable Binary (XB) is a universal file format used for serialising binary trees, and Extended Binary Coded Decimal Interchange Code (EBCDIC) was an 8-bit character encoding used by IBM in the 1960s. EBCDIC is a non-standard encoding that IBM used before switching to ASCII peripherals.


What is the difference between ascii and ebcdic?

With the advancement of technology and our use of computers, the importance of ASCII and EBCDIC has all but ebbed. Both were important in character encoding; however, ASCII used 7 bits to encode characters before being extended, whereas EBCDIC used 8 bits for the same purpose. ASCII has more characters than its counterpart, and its ordering of letters is linear, while EBCDIC's is not. There are different versions of ASCII, and despite this, most are compatible with one another; because of IBM's exclusive control of EBCDIC, that encoding cannot meet the needs of modern encoding schemes such as Unicode.


What is the binary code for B?

That depends on what you mean by "B" and what you mean by "binary code". Assuming that by "binary code" you mean the binary representation of its ASCII value, the answer is 1000010: the ASCII value of the character "B" is 66 in decimal, and 66 in binary is 1000010. If, on the other hand, you mean "what is the binary value of the hexadecimal digit B?", then the answer is 1011.
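
As a quick sketch of the two readings in C (nothing more than an illustration):

#include <stdio.h>

int main(void) {
    char ch = 'B';      /* the character 'B': ASCII code 66 = 1000010 binary */
    int hex_b = 0xB;    /* the hexadecimal digit B: the number 11 = 1011 binary */

    printf("'%c' has ASCII code %d\n", ch, ch);
    printf("0xB as a number is %d\n", hex_b);
    return 0;
}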


Why is 01100001 A in binary code?

0110 0001 binary is 0x61 hexadecimal, or 97 decimal, which is the ASCII code for lowercase 'a' (uppercase 'A' is 01000001, or 65).

Related questions

'A' is represented in the binary system as?

Most computers use ASCII (or some similar) encoding, in which 'A' is represented as 65, or 01000001 in binary. Older IBM mainframes use an entirely different encoding (EBCDIC).


What are the similarities between ascii-7 and ascii-8?

First of all, ASCII is an encoding system that tells how the binary data in a file can be represented as text. It was, and still is, very widely used, starting in the 1960s. Standard ASCII is a 7-bit encoding allowing 128 values, while Extended ASCII is an 8-bit encoding allowing 256 values, that is, 128 more characters in the table. The first 128 Extended ASCII characters are the same as the ASCII table; the next 128 are additional characters.


What Is a standard for encoding and interpreting binary files images video and non-ASCII character sets within an email message?

MIME (Multipurpose Internet Mail Extensions), introduced in 1992, is a standard for encoding and interpreting binary files, images, video, and non-ASCII character sets within an e-mail message.
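
As a rough sketch of the idea behind MIME's Base64 transfer encoding (not the full MIME machinery): every three payload bytes are regrouped into four 6-bit values and emitted as characters from a 64-symbol ASCII alphabet, so arbitrary binary data survives transport as plain text. Something like the following C fragment shows the principle:

#include <stdio.h>
#include <stdint.h>
#include <string.h>

static const char b64[] =
    "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/";

int main(void) {
    const unsigned char data[] = "Network";          /* any binary payload would do */
    size_t len = strlen((const char *)data);

    for (size_t i = 0; i < len; i += 3) {
        size_t n = (len - i < 3) ? len - i : 3;       /* bytes in this group */
        uint32_t chunk = 0;
        for (size_t j = 0; j < 3; j++)
            chunk = (chunk << 8) | (j < n ? data[i + j] : 0);
        char out[5] = "====";                         /* '=' pads a short final group */
        for (size_t j = 0; j <= n; j++)
            out[j] = b64[(chunk >> (18 - 6 * j)) & 0x3F];
        printf("%s", out);
    }
    printf("\n");                                     /* prints TmV0d29yaw== */
    return 0;
}

Decoding simply reverses the table lookup; the point is that only printable ASCII characters ever appear in the message body.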


Why would you use ASCII instead of Binary?

ASCII = American Standard Code for Information Interchange. That means ASCII is a type of character encoding. Unless you want to write in 1's and 0's, you must use ASCII. If you type a single character, it's most likely ASCII. To show you how ridiculous typing in binary is:
01110111 01101001 01101011 01101001 00100000 01100001 01101110 01110011 01110111 01100101 01110010 01110011 = "wiki answers" (lowercase)
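
To make that concrete, here is a small C sketch (buffer size and names are arbitrary choices) that produces exactly that kind of unbroken bit string from a message and then reads it back eight digits at a time:

#include <stdio.h>
#include <string.h>

int main(void) {
    const char *msg = "wiki answers";
    size_t len = strlen(msg);
    char bits[8 * 64 + 1] = {0};               /* room for up to 64 characters */

    /* Encode: append each 8-bit ASCII code, most significant bit first. */
    for (size_t i = 0; i < len; i++)
        for (int bit = 7; bit >= 0; bit--)
            bits[i * 8 + (size_t)(7 - bit)] = ((msg[i] >> bit) & 1) ? '1' : '0';
    printf("%s\n", bits);

    /* Decode: read the bit string back eight digits at a time. */
    size_t nbits = strlen(bits);
    for (size_t i = 0; i + 8 <= nbits; i += 8) {
        unsigned char c = 0;
        for (int j = 0; j < 8; j++)
            c = (unsigned char)((c << 1) | (unsigned char)(bits[i + j] - '0'));
        putchar(c);
    }
    putchar('\n');
    return 0;
}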


How does a computer process binary digits into keyboard characters?

Characters are represented using binary digit combinations. For example, ASCII (the American Standard Code for Information Interchange) is one such encoding.


What is the ascii code in binary and decimal for a period?

The ASCII code for a period is 0x2E in hexadecimal, which is 46 in decimal and 101110 in binary.


What is EBCDIC?

EBCDIC is the Extended Binary Coded Decimal Interchange Code, a character encoding scheme developed and used by IBM. EBCDIC has been completely overshadowed by ASCII and ASCII's big brother, Unicode. EBCDIC is very difficult to use, as its alphabet is non-contiguous and the encoding makes no logical sense.


What is the binary equivalent of the word NETWORK?

From the ASCII table: N = 78, E = 69, T = 84, W = 87, O = 79, R = 82, K = 75. In 7-bit binary: 1001110 1000101 1010100 1010111 1001111 1010010 1001011.


What is the binary code for mike?

Assuming ASCII encoding with 8-bit characters, "mike" is: 01101101 01101001 01101011 01100101 (m, i, k, e). Stored as a string, those bytes appear in that order on any machine; byte order (endianness) only matters if the four bytes are packed into a single 32-bit integer, in which case a Little Endian machine holds them as 01100101 01101011 01101001 01101101. The binary representation of a character also depends on the particular character encoding being used (ASCII was originally the most common encoding, with UTF-8 now superseding it).
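
A short C sketch of the byte-order point (the names are only illustrative): stored as a plain character array, the bytes of "mike" sit in the same order on every machine, and endianness only shows up once those bytes are reinterpreted as a single 32-bit integer:

#include <stdio.h>
#include <stdint.h>
#include <inttypes.h>
#include <string.h>

int main(void) {
    const unsigned char text[4] = {'m', 'i', 'k', 'e'};

    /* As a byte array the string looks the same on every machine. */
    for (int i = 0; i < 4; i++)
        printf("%c = 0x%02X\n", text[i], text[i]);

    /* Byte order matters only when the four bytes are packed into one integer. */
    uint32_t value;
    memcpy(&value, text, sizeof value);
    printf("as uint32_t: 0x%08" PRIX32 "\n", value);
    /* little-endian host: 0x656B696D ("ekim"); big-endian host: 0x6D696B65 ("mike") */
    return 0;
}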


What is a method of translating data into code?

One method of translating data into code is encoding. Encoding is the process of transforming data into a format that can be easily processed or transmitted by a computer. Common encoding methods include binary encoding, ASCII, and Unicode. These methods assign numeric values or bit patterns to represent the data, allowing it to be stored or transmitted as code.


What is the meaning of ASCII?

American Standard Code for Information Interchange. ASCII is a form of character encoding.