Best Answer

What encoding scheme is used by the 802.11a and 802.11g standards but not by the 802.11b standard?

OFDM (Orthogonal Frequency-Division Multiplexing). Both 802.11a and 802.11g use OFDM for their higher data rates, whereas 802.11b uses DSSS (Direct-Sequence Spread Spectrum) with CCK modulation instead.

Related questions

What is EBCDIC?

EBCDIC is the Extended Binary Coded Decimal Interchange Code, a character encoding scheme developed and used by IBM, mainly on its mainframes. Today it is almost completely overshadowed by ASCII and ASCII's big brother, Unicode. EBCDIC is awkward to work with because the alphabet is not encoded contiguously, so simple character arithmetic that works in ASCII does not work in EBCDIC.
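
A quick way to see that non-contiguous alphabet is to compare code values side by side. This is just a sketch using Python's built-in cp037 codec (EBCDIC US/Canada) as a stand-in for EBCDIC in general:

```python
# ASCII letters are contiguous (97..122); EBCDIC letters jump between
# three separate ranges (0x81-0x89, 0x91-0x99, 0xA2-0xA9).
for ch in "abcdefghijklmnopqrstuvwxyz":
    ascii_code = ord(ch)
    ebcdic_code = ch.encode("cp037")[0]   # cp037 = EBCDIC US/Canada
    print(f"{ch}  ASCII={ascii_code:3d}  EBCDIC=0x{ebcdic_code:02X}")
```

Note the gap between 'i' (0x89) and 'j' (0x91), which is why code that assumes 'j' == 'i' + 1 breaks under EBCDIC.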


What is ASCII?

ASCII (American Standard Code for Information Interchange) is a computer encoding scheme originally designed for American English characters. It assigns each character a numeric code from 0 to 127, so text can be stored and exchanged as numbers.
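
As a minimal sketch (standard Python, nothing beyond the built-in ord and chr functions):

```python
# Each character corresponds to a number in the 0-127 ASCII range.
print(ord("A"))   # 65
print(ord("a"))   # 97
print(ord("0"))   # 48
print(chr(65))    # 'A' -- the reverse mapping

# A string is stored as the sequence of those numbers, one per character.
print([ord(c) for c in "Hi!"])   # [72, 105, 33]
```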


What is the base of a character?

In computer science, the "base" of a character typically refers to its character encoding, which defines a mapping between characters and numeric values (often in binary form) for representation in digital systems. The numeric value a character gets varies with the encoding scheme used, such as ASCII or a Unicode encoding like UTF-8, which determines how characters are stored and interpreted by computers.
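
For example (a sketch using Python's standard codecs), the same character can map to different byte values depending on the encoding chosen:

```python
ch = "é"                        # Unicode code point U+00E9
print(ord(ch))                  # 233 -- the numeric value of the character
print(ch.encode("latin-1"))     # b'\xe9'      -- one byte in Latin-1
print(ch.encode("utf-8"))       # b'\xc3\xa9'  -- two bytes in UTF-8
print(ch.encode("utf-16-be"))   # b'\x00\xe9'  -- two bytes in UTF-16
```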


Which circuit is used in keyboards?

A matrix encoding circuit (keyboard matrix). The keys sit at the intersections of row and column lines; a controller drives each row in turn and reads the columns to work out which key was pressed, as sketched below.
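
A toy model of that scanning logic (plain Python; the pressed_switches set is a stand-in for the physical key switches, included for illustration only):

```python
# Keys live at row/column crossings. A real controller drives one row
# line at a time and samples the column lines; here a set of (row, col)
# pairs simulates which switches are closed.
ROWS, COLS = 4, 4
pressed_switches = {(1, 2), (3, 0)}   # pretend these two keys are held down

def read_columns(driven_row):
    """Simulated column read while `driven_row` is being driven."""
    return [(driven_row, col) in pressed_switches for col in range(COLS)]

def scan_matrix():
    """Scan every row and return the (row, col) positions of pressed keys."""
    hits = []
    for row in range(ROWS):
        for col, closed in enumerate(read_columns(row)):
            if closed:
                hits.append((row, col))
    return hits

print(scan_matrix())   # [(1, 2), (3, 0)]
```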


Why DSP processor is used in cell phones?

You need DSP processors to encode and decode the signals; they perform the channel coding and source coding.


Why is algebra the language of technology?

Because it is used when encoding your documents.


What is a character encoding standard?

Character encoding is the way that a computer interprets and then displays a file as text. Each encoding has its own set of characters that it can match to the file. For example, the Windows-1252 encoding, used for Western European languages, contains characters like the accented vowels used in Spanish, French, etc. An encoding used for Russian and related languages, however, would include characters from the Cyrillic alphabet. Most older encodings use 8 bits per character, which limits them to at most 256 characters. Unicode is a newer system that uses a significantly different approach to character encoding, allowing it to surpass the 256-character limit; over 100,000 characters are currently supported by Unicode and its UTF-8 encoding.
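
To see that 256-character ceiling in practice, here is a sketch using Python's standard codecs (Windows-1252 is the single-byte encoding mentioned above):

```python
text = "café"
print(text.encode("cp1252"))    # b'caf\xe9' -- one byte per character

# A character outside the 256-entry table simply cannot be represented:
try:
    "Привет".encode("cp1252")
except UnicodeEncodeError as err:
    print("not representable in cp1252:", err)

# UTF-8 has no such ceiling; it just spends more bytes on rarer characters.
print("Привет".encode("utf-8"))
```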


What are the three processes used by sender-receivers?

* Sensory perception
* Encoding
* Transmitting


What scheme is the most widely used coding scheme to represent data that is used by most personal computers and servers?

ASCII (American Standard Code for Information Interchange) is the most widely used coding scheme.


How ASCII is used in the computer?

ASCII is a form of character encoding, so your computer can use it whenever text has to be stored or transmitted as bytes.


Apart from ASCII and EBCDIC, are there other popular coding schemes used for Internet applications?

ASCII is very common; EBCDIC is hardly so. However, ASCII has been almost completely superseded by Unicode, which is by far the most common encoding scheme anywhere. Unicode comes in several encoding forms (UTF-8, UTF-16, UTF-32, etc.). UTF-8 is a variable-length, 8-bit extension of the 7-bit ASCII coding scheme and can encode any character available in Unicode. UTF-16 and UTF-32 are used mainly for text in scripts whose characters would almost always require more than one byte anyway.
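
A short sketch (standard Python) of the ASCII compatibility of UTF-8 and of the wider encoding forms:

```python
ascii_text = "hello"
print(ascii_text.encode("ascii"))     # b'hello'
print(ascii_text.encode("utf-8"))     # b'hello' -- identical bytes, by design

mixed = "héllo, 世界"
print(mixed.encode("utf-8"))          # ASCII chars stay 1 byte, others take 2-3
print(mixed.encode("utf-16-le"))      # at least 2 bytes for every character
print(len(mixed.encode("utf-32-le"))) # 36: a fixed 4 bytes per character
```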


What is 7-bit code?

7-bit code is an encoding scheme that uses just 7 bits per character, allowing values in the range 0 through 127. ASCII (standardized by ANSI) is a 7-bit encoding scheme. When it is used on an 8-bit system, bit 7 is always zero (where bit 0 is the least significant bit). Bit 7 is used by the extended ASCII character sets, whose first 128 characters are the same as those defined by 7-bit ASCII. 7-bit encoding was often used in early teleprinting, where bit 7 was not needed: eight 7-bit characters could be packed into a 56-bit package, transmitted, and decoded by the 7-bit teleprinter -- the idea being that the fewer bits you transmit, the quicker the data travels across a telephone line.
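
A sketch of that packing idea in plain Python (illustrative only, not a real teleprinter framing):

```python
def pack_7bit(text):
    """Pack 7-bit character codes into one integer, 7 bits per character."""
    value = 0
    for ch in text:
        code = ord(ch)
        assert code < 128, "only 7-bit (ASCII) characters can be packed"
        value = (value << 7) | code
    return value

def unpack_7bit(value, count):
    """Recover `count` characters from a packed integer."""
    chars = []
    for _ in range(count):
        chars.append(chr(value & 0x7F))   # low 7 bits = last character packed
        value >>= 7
    return "".join(reversed(chars))

packed = pack_7bit("TELETYPE")           # 8 characters x 7 bits = 56 bits
print(packed.bit_length())               # 56
print(unpack_7bit(packed, 8))            # TELETYPE
```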