Glossary

Byte


A byte is a unit of digital information equal to eight bits. It is abbreviated with an uppercase letter "B". A byte is used to represent a single character in a computer, i.e. a letter, number, or symbol.

The use of the term byte began in 1956, when Dr. Werner Buchholz coined it while working on early IBM computer technology. At the time, the industry was working out how to translate alphanumeric characters, mathematical notation, and other symbols into something the computers of the day could understand. Information was input into a computer using paper punch cards or tape; depending on the kind of computer, the size of the tape, the shape of the holes, and the number of holes to punch varied. Together with Bob Bemer, Dr. Buchholz defined the 8-bit character system that would become the byte and later part of the ASCII standard. ASCII defines how different characters are represented in binary. Other character standards exist with different character sets and numbering systems, though ASCII is the most widely used.
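As a rough illustration in Python (the characters chosen are arbitrary examples, not part of the original standard work), the short sketch below prints the decimal ASCII value of a few characters together with the 8-bit pattern a single byte would hold:

    # Each ASCII character fits in a single byte (8 bits).
    for ch in ("A", "a", "7", "!"):
        code = ord(ch)              # decimal ASCII value, e.g. 'A' -> 65
        bits = format(code, "08b")  # the same value as an 8-bit binary pattern
        print(f"'{ch}' -> {code:3d} -> {bits}")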

In addition to representing a single character, the byte is used to express computer RAM and hard disk capacities. Large quantities of memory are expressed in terms of the kilobyte or KB (1,024 bytes), megabyte or MB (1,048,576 bytes), gigabyte or GB (1,073,741,824 bytes), and terabyte or TB (1,099,511,627,776 bytes). As computing power and capacity increase, ever larger units of measure come into use, and new measures have to be invented to match the enormous amounts of storage predicted for the future.
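As a minimal sketch, the following Python lines derive each of these units from the byte using the 1,024-based definitions given above:

    # Binary (1,024-based) storage units, matching the figures above.
    BYTE = 1
    KB = 1024 * BYTE   # kilobyte: 1,024 bytes
    MB = 1024 * KB     # megabyte: 1,048,576 bytes
    GB = 1024 * MB     # gigabyte: 1,073,741,824 bytes
    TB = 1024 * GB     # terabyte: 1,099,511,627,776 bytes

    for name, size in [("KB", KB), ("MB", MB), ("GB", GB), ("TB", TB)]:
        print(f"1 {name} = {size:,} bytes")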

The terms bit and byte are often confused. This is, in fact, why Dr. Buchholz spelled "bite" as "byte": to avoid an accidental change to "bit" through inadvertently dropping the "e". Be aware that in streaming technology it is most commonly bits that are referred to when looking at transmission speeds, i.e. kilobits per second is the most common measure.
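A small illustrative Python snippet (the 5,000 kilobits per second figure is an arbitrary example) shows how a bit rate converts to the corresponding byte rate by dividing by eight:

    # Transmission speeds are quoted in bits per second, not bytes per second.
    # Dividing by 8 converts a bit rate to the equivalent byte rate.
    bit_rate_kbps = 5000  # example stream bit rate in kilobits per second
    byte_rate_kBps = bit_rate_kbps / 8
    print(f"{bit_rate_kbps:,} kilobits per second = {byte_rate_kBps:,.0f} kilobytes per second")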
