The first programmable computer, built by Konrad Zuse, used binary notation for numbers.

A bit can be stored by a digital device or other physical system that exists in either of two possible distinct states. These may be the two stable states of a flip-flop, two positions of an electrical switch, two distinct voltage or current levels allowed by a circuit, two distinct levels of light intensity, two directions of magnetization or polarization, the orientation of reversible double-stranded DNA, etc.

Bits can be implemented in several forms. In the earliest non-electronic information processing devices, such as Jacquard's loom or Babbage's Analytical Engine, a bit was often stored as the position of a mechanical lever or gear, or the presence or absence of a hole at a specific point of a paper card or tape. In most modern computing devices, a bit is usually represented by an electrical voltage or current pulse, or by the electrical state of a flip-flop circuit.

For devices using positive logic, a digit value of 1 (or a logical value of true) is represented by a more positive voltage relative to the representation of 0. Different logic families require different voltages, and variations are allowed to account for component aging and noise immunity. For example, in transistor–transistor logic (TTL) and compatible circuits, digit values 0 and 1 at the output of a device are represented by no higher than 0.4 volts and no lower than 2.6 volts, respectively, while TTL inputs are specified to recognize 0.8 volts or below as 0 and 2.2 volts or above as 1.

Bits are transmitted one at a time in serial transmission, and by a multiple number of bits in parallel transmission. A bitwise operation processes bits one at a time. Data transfer rates are usually measured in decimal SI multiples of the unit bit per second (bit/s), such as kbit/s.
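The single-bit manipulations behind bitwise operations can be sketched with ordinary integer operators. The following Python snippet (an illustrative sketch with arbitrary example values, not from the original article) sets, clears, toggles, and tests individual bits of an 8-bit value:

```python
# Bitwise operations on an 8-bit value; bit 0 is the least significant bit.
value = 0b0100_1010  # 74: bits 1, 3, and 6 are set

set_bit5    = value |  (1 << 5)       # force bit 5 to 1
clear_bit3  = value & ~(1 << 3)       # force bit 3 to 0
toggle_bit0 = value ^  (1 << 0)       # flip bit 0
bit6_is_set = bool(value & (1 << 6))  # test a single bit

print(f"{set_bit5:08b}")     # 01101010
print(f"{clear_bit3:08b}")   # 01000010
print(f"{toggle_bit0:08b}")  # 01001011
print(bit6_is_set)           # True
```

The shifted constant `1 << n` is a mask with exactly one bit set, which is the usual way to address a single bit within a larger word.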
The encoding of data by discrete bits was used in the punched cards invented by Basile Bouchon and Jean-Baptiste Falcon (1732), developed by Joseph Marie Jacquard (1804), and later adopted by Semyon Korsakov, Charles Babbage, Herman Hollerith, and early computer manufacturers like IBM. A variant of that idea was the perforated paper tape. In all those systems, the medium (card or tape) conceptually carried an array of hole positions; each position could be either punched through or not, thus carrying one bit of information. The encoding of text by bits was also used in Morse code (1844) and early digital communications machines such as teletypes and stock ticker machines (1870).

Ralph Hartley suggested the use of a logarithmic measure of information in 1928, and Vannevar Bush had written in 1936 of "bits of information" that could be stored on the punched cards used in the mechanical computers of that time. Claude E. Shannon first used the word "bit" in his seminal 1948 paper "A Mathematical Theory of Communication", attributing its origin to John W. Tukey, who had written a Bell Labs memo on 9 January 1947 in which he contracted "binary information digit" to simply "bit".

In information theory, one bit is the information entropy of a random binary variable that is 0 or 1 with equal probability, or the information that is gained when the value of such a variable becomes known. As a unit of information, the bit is also known as a shannon, named after Claude E. Shannon. The symbol for the binary digit is either "bit", per the IEC 80000-13:2008 standard, or the lowercase character "b", per the IEEE 1541-2002 standard. Use of the latter may create confusion with the capital "B", which is the international standard symbol for the byte. Frequently, half, full, double and quadruple words consist of a number of bytes which is a low power of two.
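The information-theoretic definition of the bit can be checked numerically. The short Python sketch below (an illustration added here, with example probabilities chosen for demonstration) computes the entropy of a binary variable, confirming that a fair 0/1 variable carries exactly one bit, i.e. one shannon:

```python
import math

def binary_entropy(p: float) -> float:
    """Entropy, in bits, of a binary variable that is 1 with probability p."""
    if p in (0.0, 1.0):
        return 0.0  # the outcome is certain, so learning it conveys no information
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

print(binary_entropy(0.5))  # 1.0 -> a fair binary variable carries exactly one bit
print(binary_entropy(0.9))  # about 0.469 -> a biased variable carries less than one bit
```

The entropy peaks at one bit only for equal probabilities; any bias means the value is partly predictable, so observing it yields less information.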
The bit is the most basic unit of information in computing and digital communications. The name is a portmanteau of binary digit. The bit represents a logical state with one of two possible values. These values are most commonly represented as either "1" or "0", but other representations such as true/false, yes/no, on/off, or +/− are also widely used. The relation between these values and the physical states of the underlying storage or device is a matter of convention, and different assignments may be used even within the same device or program. A bit may be physically implemented with a two-state device. A contiguous group of binary digits is commonly called a bit string, a bit vector, or a single-dimensional (or multi-dimensional) bit array. A group of eight bits is called one byte, but historically the size of the byte is not strictly defined.
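As a small illustration of how a bit string relates to a byte, the following Python sketch (example values are arbitrary) packs a sequence of eight 0/1 values into a single byte:

```python
# A bit string modelled as a list of 0/1 values, most significant bit first.
bits = [0, 1, 0, 0, 0, 0, 0, 1]

byte = 0
for b in bits:
    byte = (byte << 1) | b  # shift the accumulated bits left and append the next one

print(byte)       # 65
print(bin(byte))  # 0b1000001
print(chr(byte))  # 'A' -> the same eight bits read as an ASCII character code
```

The same eight bits can be read as a number or as a character; which interpretation applies is, as noted above, a matter of convention.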