Byte

unit of digital information equal to 8 bits


The byte (symbol: B or o, also known as the octet) is a unit of measurement for the size of information on a computer or other electronic device. A single byte is eight bits. Some early computers, however, used six bits for each byte. The bit is the smallest unit of storage on a computer: a single on/off value.

A single typed ASCII character (for example, 'x' or '8') is stored in one byte. The character is held as a binary number, and an agreed code such as EBCDIC or ASCII is needed to map each number to a character. EBCDIC is a character encoding used mainly on mainframe computers; it uses 8 bits per character. ASCII is another encoding that uses only seven bits. Extended ASCII uses 8 bits to provide more characters and is mostly used on personal computers.
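As a rough illustration (a minimal Python sketch added here, not part of the original article), the way one ASCII character fits into a single byte can be inspected directly:

    # Minimal sketch: look at how one ASCII character is stored as a single byte.
    text = "x"

    encoded = text.encode("ascii")    # bytes object holding the character
    print(len(encoded))               # 1 -> one character fits in one byte
    print(encoded[0])                 # 120 -> the numeric ASCII code for 'x'
    print(format(encoded[0], "08b"))  # '01111000' -> the same value shown as 8 bits

The printed binary string is the bit pattern the byte holds; a different agreed encoding (such as EBCDIC) would simply map the same character to a different number.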

The byte is the smallest useful unit of measure for showing how many characters a computer (or other electronic device) can hold. This is useful for things like RAM, or storage devices like USB drives and other types of flash memory. Data transmission (for example over a modem or Wi-Fi) is usually measured in bits, not bytes.

On modern computers, one byte is equal to eight bits. Some early computers used fewer bits for each byte. To tell the two sizes apart, computer scientists called an 8-bit byte an octet. In modern usage, an octet and a byte are the same.


Symbol

The symbol for the byte is B. A lowercase "b" is sometimes used for the byte, but this is incorrect because "b" is the IEEE symbol for the bit (the IEC symbol for the bit is "bit"). For example, "MB" means "megabyte" and "Mbit" means "megabit". The difference matters: 1 megabyte (MB) is 1,000,000 bytes, while 1 megabit (Mbit) is 1,000,000 bits, or only 125,000 bytes. Because the two are easy to confuse, "b" (or "bit") should be used when referring to bits and an uppercase "B" when referring to bytes.
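Because confusing bits with bytes changes a size by a factor of eight, a small worked example helps. The following Python sketch (an illustration added here, using a made-up figure) converts megabits into bytes:

    # Sketch: 1 megabit is 1,000,000 bits; dividing by 8 gives the size in bytes.
    BITS_PER_BYTE = 8

    megabits = 1                          # hypothetical figure for illustration
    bits = megabits * 1_000_000
    size_in_bytes = bits // BITS_PER_BYTE
    print(size_in_bytes)                  # 125000 -> 1 Mbit is only 125,000 bytes
    print(1 * 1_000_000)                  # 1000000 -> 1 MB, by contrast, is 1,000,000 bytes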


Names for larger units

For large data, the byte is often used with a metric (decimal) or binary prefix:

  • Kilobyte (KB) (10^3)/kibibyte (KiB) (2^10)
  • Megabyte (MB) (10^6)/mebibyte (MiB) (2^20)
  • Gigabyte (GB) (10^9)/gibibyte (GiB) (2^30)
  • Terabyte (TB) (10^12)/tebibyte (TiB) (2^40)
  • Petabyte (PB) (10^15)/pebibyte (PiB) (2^50)

The following terms represent even larger units of bytes, but are very rarely used:

  • Exabyte (EB) (10^18)/exbibyte (EiB) (2^60)
  • Zettabyte (ZB) (10^21)/zebibyte (ZiB) (2^70)
  • Yottabyte (YB) (10^24)/yobibyte (YiB) (2^80)
  • Ronnabyte (RB) (10^27)/robibyte (RiB) (2^90)
  • Quettabyte (QB) (10^30)/quebibyte (QiB) (2^100)
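To give a feel for how far apart the decimal and binary versions of these large units drift, here is a small Python sketch (an added illustration, not part of the article) comparing one exabyte with one exbibyte:

    # Sketch: compare one exabyte (10**18 bytes) with one exbibyte (2**60 bytes).
    exabyte = 10 ** 18
    exbibyte = 2 ** 60

    print(exbibyte - exabyte)   # 152921504606846976 bytes of difference
    print(exbibyte / exabyte)   # ~1.153 -> an exbibyte is about 15% larger

The gap between the two conventions grows with each prefix: kibi is only 2.4% larger than kilo, but exbi is already about 15% larger than exa.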

Byte Chart

According to the International Electrotechnical Commission (IEC), which sets many computer standards, these charts show how multiples of the byte should be referred to.

People who refer to 1 kilobyte as 1,024 bytes, for example, are technically incorrect; according to the IEC, 1,024 bytes should be called 1 kibibyte.[1] However, using 1,024 for kilo and 1,048,576 for mega, and so on, was widely practiced before the IEC standards were set in 1998, so there is still some confusion and mixing of terms in the marketplace. Computer memory is usually still described in powers of 2, so 1 KB of memory is 1,024 bytes, whereas computer data storage usually uses powers of 10, so 1 KB of storage is 1,000 bytes.
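As a concrete illustration of the two conventions (a small Python sketch added here, using a made-up file size), the same number of bytes can be reported under either prefix:

    # Sketch: report the same size using decimal (kilo/mega) and binary (kibi/mebi) prefixes.
    size_bytes = 1_048_576             # hypothetical file size used for illustration

    print(size_bytes / 1000)           # 1048.576 -> kilobytes (KB, powers of 10)
    print(size_bytes / 1024)           # 1024.0   -> kibibytes (KiB, powers of 2)
    print(size_bytes / 1_000_000)      # 1.048576 -> megabytes (MB)
    print(size_bytes / (1024 * 1024))  # 1.0      -> mebibytes (MiB)

This is why a drive advertised as 500 GB (powers of 10) shows up as roughly 465 GiB when an operating system reports it in powers of 2.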

"kilo-" = 1,000

Standard metric prefixes like "kilo-", "mega-" and "giga-" should follow the same scale as other metric measurements, such as the kilometer (1 kilometer = 1,000 meters) or the gigahertz (1 gigahertz = 1,000,000,000 hertz).

Unit             Number
kilobyte (KB)    1,000
megabyte (MB)    1,000,000
gigabyte (GB)    1,000,000,000
terabyte (TB)    1,000,000,000,000
petabyte (PB)    1,000,000,000,000,000

"kibi-" = 1,024

Since computers are digital devices based on the binary numeral system rather than the commonly used decimal numeral system (or binary-coded decimal), there are many situations where the standard metric prefixes do not fit well, particularly for the memory sizes of a computer or storage device. If a memory or storage device uses binary numbers for its addresses, the number of different positions that can be accessed (the size of the memory) is naturally a power of 2 rather than a power of 10 (see the sketch after the chart below).

Unit              Number
kibibyte (KiB)    1,024
mebibyte (MiB)    1,048,576
gibibyte (GiB)    1,073,741,824
tebibyte (TiB)    1,099,511,627,776
pebibyte (PiB)    1,125,899,906,842,624
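For example (a minimal Python sketch under the assumption of a simple byte-addressable memory; the function name is made up for illustration), the number of bytes a binary address can reach is always a power of 2:

    # Sketch: a memory with n address bits can address 2**n distinct byte positions.
    def addressable_bytes(address_bits: int) -> int:
        return 2 ** address_bits

    print(addressable_bytes(10))   # 1024       -> 1 KiB
    print(addressable_bytes(20))   # 1048576    -> 1 MiB
    print(addressable_bytes(32))   # 4294967296 -> 4 GiB, the familiar 32-bit limit

Because the sizes come out as powers of 2, the binary prefixes (KiB, MiB, GiB) describe memory exactly, while the decimal prefixes only approximate it.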

References
