Gigahertz

One gigahertz is equal to 1,000 megahertz (MHz) or 1,000,000,000 Hz. It is commonly used to measure computer processing speeds. For many years, computer CPU speeds were measured in megahertz, but after personal computers eclipsed the 1,000 MHz mark around the year 2000, gigahertz became the standard measurement unit. After all, it is easier to say "2.4 gigahertz" than "2,400 megahertz."
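The conversions above can be sketched in a few lines of Python (the constant and function names are illustrative, not part of any standard library):

```python
# Unit relationships from the definition: 1 GHz = 1,000 MHz = 1,000,000,000 Hz
MHZ_PER_GHZ = 1_000
HZ_PER_GHZ = 1_000_000_000

def mhz_to_ghz(mhz):
    """Convert a clock speed in megahertz to gigahertz."""
    return mhz / MHZ_PER_GHZ

def ghz_to_hz(ghz):
    """Convert a clock speed in gigahertz to hertz."""
    return ghz * HZ_PER_GHZ

print(mhz_to_ghz(2400))  # 2,400 MHz is 2.4 GHz
print(ghz_to_hz(2.4))    # 2.4 GHz is 2,400,000,000 Hz
```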

While gigahertz is most commonly used to measure processor speed, it can also measure the speed of other parts of the computer, such as the RAM and backside cache. The speed of these components, along with other parts of the computer, also affects the computer's overall performance. Therefore, when comparing computers, remember that the number of gigahertz is not the only thing that matters.

Abbreviation: GHz.

Updated August 15, 2007

Definitions by TechTerms.com
