Gigabit

A gigabit is 10⁹ or 1,000,000,000 bits.

One gigabit (abbreviated "Gb") is equal to 1,000 megabits or 1,000,000 kilobits. It is one-eighth the size of a gigabyte (GB).

Gigabits are most often used to measure data transfer rates of local networks and I/O connections. For example, Gigabit Ethernet is a common Ethernet standard that supports data transfer rates of one gigabit per second (Gbps) over a wired Ethernet network. Modern I/O technologies, such as USB 3.0 and Thunderbolt, are also measured in gigabits per second. USB 3.0 can transfer data at up to 5 Gbps, while Thunderbolt 1.0 can transfer data bidirectionally at 10 Gbps.
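These rates can be used to estimate a theoretical minimum transfer time. The sketch below assumes an illustrative 25-gigabyte file and the link speeds named above; real-world throughput is lower due to protocol overhead.

```python
# Sketch: theoretical minimum transfer time at the link rates above.
# The file size (25 GB) is an illustrative assumption, using decimal
# (SI) gigabytes. Rates are in gigabits per second.

FILE_SIZE_GB = 25                      # gigabytes
FILE_SIZE_GIGABITS = FILE_SIZE_GB * 8  # 8 bits per byte

links_gbps = {
    "Gigabit Ethernet": 1,
    "USB 3.0": 5,
    "Thunderbolt 1.0": 10,
}

for name, rate in links_gbps.items():
    seconds = FILE_SIZE_GIGABITS / rate
    print(f"{name}: {seconds:.0f} seconds")
```

At 1 Gbps, the 200-gigabit file takes at least 200 seconds; at 10 Gbps, 20 seconds.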

While gigabits and gigabytes sound similar, it is important not to confuse the two terms. Since there are eight bits in one byte, there are also eight gigabits in one gigabyte. Gigabits are most often used to describe data transfer speeds, while gigabytes are used to measure data storage.
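The conversion between the two units is a simple factor of eight, as a minimal sketch shows (function names here are illustrative):

```python
# Sketch: converting between gigabits (Gb) and gigabytes (GB),
# using the 8-bits-per-byte relationship from the definition above.

BITS_PER_BYTE = 8

def gigabits_to_gigabytes(gigabits: float) -> float:
    """One gigabyte contains eight gigabits."""
    return gigabits / BITS_PER_BYTE

def gigabytes_to_gigabits(gigabytes: float) -> float:
    return gigabytes * BITS_PER_BYTE

print(gigabits_to_gigabytes(8))  # 1.0 — eight gigabits equal one gigabyte
print(gigabytes_to_gigabits(1))  # 8  — one gigabyte equals eight gigabits
```

So a connection rated at 8 Gbps moves at most one gigabyte of data per second.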

Updated October 3, 2013

Definitions by TechTerms.com
