
Gigabit

A gigabit is 10⁹ or 1,000,000,000 bits.

One gigabit (abbreviated "Gb") is equal to 1,000 megabits or 1,000,000 kilobits. It is one-eighth the size of a gigabyte (GB).
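As a quick illustration, the decimal conversions above can be checked with a few lines of Python (the constant names and the sample value are illustrative only, not part of any standard):

```python
# Decimal (SI) bit units: each step up is a factor of 1,000.
BITS_PER_KILOBIT = 1_000
BITS_PER_MEGABIT = 1_000_000
BITS_PER_GIGABIT = 1_000_000_000
BITS_PER_BYTE = 8

bits = 1 * BITS_PER_GIGABIT                     # one gigabit expressed in bits
print(bits // BITS_PER_MEGABIT)                 # 1000      -> 1,000 megabits
print(bits // BITS_PER_KILOBIT)                 # 1000000   -> 1,000,000 kilobits
print(bits / BITS_PER_BYTE / BITS_PER_GIGABIT)  # 0.125     -> one-eighth of a gigabyte
```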

Gigabits are most often used to measure data transfer rates of local networks and I/O connections. For example, Gigabit Ethernet is a common Ethernet standard that supports data transfer rates of one gigabit per second (Gbps) over a wired Ethernet network. Modern I/O technologies, such as USB 3.0 and Thunderbolt, are also measured in gigabits per second. USB 3.0 can transfer data at up to 5 Gbps, while Thunderbolt 1.0 can transfer data bidirectionally at 10 Gbps.
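To see what these rates mean in practice, the sketch below estimates how long a hypothetical 25 GB file would take to transfer at each interface's theoretical maximum rate. Real-world throughput is lower because of protocol overhead, so treat the results as upper bounds, not measurements.

```python
# Rough transfer-time estimates at the theoretical line rates mentioned above.
# The 25 GB file size is a hypothetical example.
def transfer_seconds(size_gigabytes: float, rate_gbps: float) -> float:
    size_gigabits = size_gigabytes * 8   # 1 byte = 8 bits
    return size_gigabits / rate_gbps     # Gbps means gigabits per second

file_size_gb = 25
for name, rate_gbps in [("Gigabit Ethernet", 1), ("USB 3.0", 5), ("Thunderbolt", 10)]:
    print(f"{name}: about {transfer_seconds(file_size_gb, rate_gbps):.0f} seconds")
# Gigabit Ethernet: about 200 seconds
# USB 3.0: about 40 seconds
# Thunderbolt: about 20 seconds
```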

While gigabits and gigabytes sound similar, it is important not to confuse the two terms. Since there are eight bits in one byte, there are also eight gigabits in one gigabyte. Gigabits are most often used to describe data transfer speeds, while gigabytes are used to measure data storage.
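The factor of eight between the two units can be captured in a pair of simple helper functions (the function names and the 4 GB example are illustrative only):

```python
BITS_PER_BYTE = 8

def gigabytes_to_gigabits(gigabytes: float) -> float:
    return gigabytes * BITS_PER_BYTE

def gigabits_to_gigabytes(gigabits: float) -> float:
    return gigabits / BITS_PER_BYTE

print(gigabytes_to_gigabits(4))    # 32.0 -> a 4 GB file holds 32 gigabits of data
print(gigabits_to_gigabytes(32))   # 4.0
```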

Updated: October 3, 2013

Cite this definition:

https://techterms.com/definition/gigabit

TechTerms - The Tech Terms Computer Dictionary

This page contains a technical definition of Gigabit. It explains in computing terminology what Gigabit means and is one of many computing terms in the TechTerms dictionary.

All definitions on the TechTerms website are written to be technically accurate but also easy to understand. If you find this Gigabit definition to be helpful, you can reference it using the citation links above. If you think a term should be updated or added to the TechTerms dictionary, please email TechTerms!