
Gigahertz

One gigahertz is equal to 1,000 megahertz (MHz) or 1,000,000,000 Hz. It is commonly used to measure computer processing speeds. For many years, computer CPU speeds were measured in megahertz, but after personal computers eclipsed the 1,000 MHz mark around the year 2000, gigahertz became the standard unit of measurement. After all, it is easier to say "2.4 gigahertz" than "2,400 megahertz."
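To make the conversion concrete, here is a minimal sketch in Python (the language is an arbitrary choice for illustration) that expresses a frequency given in hertz as both megahertz and gigahertz:

    HZ_PER_MHZ = 1_000_000        # 1 MHz = 1,000,000 Hz
    HZ_PER_GHZ = 1_000_000_000    # 1 GHz = 1,000,000,000 Hz

    def describe_frequency(hz: int) -> str:
        """Express a frequency in Hz as both MHz and GHz."""
        return f"{hz:,} Hz = {hz / HZ_PER_MHZ:,.0f} MHz = {hz / HZ_PER_GHZ:.1f} GHz"

    print(describe_frequency(2_400_000_000))
    # prints: 2,400,000,000 Hz = 2,400 MHz = 2.4 GHz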

While gigahertz is most commonly used to measure processor speed, it can also measure the speed of other parts of the computer, such as the RAM and backside cache. The speeds of these components, along with those of other parts of the computer, also affect the computer's overall performance. Therefore, when comparing computers, remember that the number of gigahertz is not the only thing that matters.
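As a hedged illustration of reading one such speed, the sketch below parses the "cpu MHz" field that the Linux kernel exposes in /proc/cpuinfo and converts it to gigahertz. The file path and field name are Linux-specific, so treat this as an assumption on any other platform:

    def current_cpu_ghz(path: str = "/proc/cpuinfo") -> float:
        """Return the first core's reported clock speed in GHz (Linux only)."""
        with open(path) as f:
            for line in f:
                if line.lower().startswith("cpu mhz"):
                    mhz = float(line.split(":")[1])   # e.g. "cpu MHz : 2400.000"
                    return mhz / 1000.0               # 1,000 MHz = 1 GHz
        raise RuntimeError("no 'cpu MHz' field found in " + path)

    print(f"{current_cpu_ghz():.2f} GHz")

Note that this reads the momentary clock reported by the kernel, which may differ from the processor's rated speed due to frequency scaling.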

Abbreviation: GHz.

Updated: August 15, 2007

Cite this definition:

http://techterms.com/definition/gigahertz
