Bit

The bit (short for binary digit) is the basic unit of information in computing and telecommunications. It represents either 1 or 0 (i.e., ON or OFF, TRUE or FALSE). The standard symbol for the bit is bit, or the lowercase letter b.
Byte

A byte (most commonly) equals eight bits. Historically, one byte was used to represent (encode) one character of text in a computer. The uppercase letter B is the standard symbol for the byte.
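To make the bit/byte relationship concrete, here is a small Python sketch (the specific string and values are illustrative, not from the original text) showing that an 8-bit byte holds 256 distinct values and that one ASCII character occupies one byte:

```python
# An 8-bit byte can hold 2**8 = 256 distinct values (0 through 255).
assert 2 ** 8 == 256

# Historically, one byte encoded one character of text, e.g. in ASCII:
text = "byte"
encoded = text.encode("ascii")   # ASCII uses exactly one byte per character
assert len(encoded) == len(text)

# The character 'b' is stored as the byte value 98, i.e. the bits 01100010:
assert encoded[0] == 0b01100010
```

Each assertion passes, illustrating why the byte became the natural unit for measuring text and storage.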
Kilobyte

The International System of Units (SI) defines the prefix kilo as 1000. Therefore, 1 kilobyte equals 1000 bytes (1000 B). However, in some areas of information technology (IT), when referring to digital storage capacity, 1 kilobyte has traditionally denoted 1024 (2 to the 10th power, approx. 1000) bytes. The standard unit symbol for the kilobyte is kB; the symbols K and KB are often used informally for 1024 bytes, while KiB is the IEC symbol for the kibibyte, which is exactly 1024 bytes.
Megabyte

SI defines the prefix mega as 1,000,000. Therefore, 1 megabyte equals 1,000,000 bytes. In some areas of IT, when referring to storage capacity, 1 megabyte instead denotes 1,048,576 (2 to the 20th power, approx. 1,000,000) bytes. The standard unit symbol for the megabyte is MB; MiB is the IEC symbol for the mebibyte, which is exactly 1,048,576 (1024x1024) bytes.
Gigabyte

SI defines the prefix giga as 1,000,000,000 (1 billion). Therefore, 1 gigabyte equals 1 billion bytes. In the same manner as above, when referring to digital storage capacity, 1 gigabyte sometimes denotes 1,073,741,824 (2 to the 30th power, approx. 1,000,000,000) bytes. The standard unit symbol for the gigabyte is GB; GiB is the IEC symbol for the gibibyte, which is exactly 1,073,741,824 (1024x1024x1024) bytes.
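The decimal (SI) versus binary (power-of-two) prefixes above can be sketched in a few lines of Python; the "500 GB drive" figure is a hypothetical example, not from the original text:

```python
# SI decimal prefixes: each step multiplies by 1000.
KB, MB, GB = 10 ** 3, 10 ** 6, 10 ** 9

# Binary (IEC) prefixes: each step multiplies by 1024 (2**10).
KiB, MiB, GiB = 2 ** 10, 2 ** 20, 2 ** 30

assert KiB == 1024
assert MiB == 1_048_576
assert GiB == 1_073_741_824

# The gap between the two conventions grows with each prefix:
# kilo: 1024/1000 is a 2.4% difference; giga: about 7.4%.
# So a hypothetical "500 GB" drive (decimal, as marketed) appears as
# roughly 465.7 GiB in tools that measure in binary units.
print(round(500 * GB / GiB, 1))  # prints 465.7
```

This growing discrepancy is why operating systems and drive manufacturers often report different sizes for the same device.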