
Gigaflops

Gigaflops is a unit of measurement for the performance of a computer's floating point unit, commonly referred to as the FPU. One gigaflops equals one billion (1,000,000,000) FLOPS, or floating point operations per second.
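As a rough illustration of the arithmetic, a processor's theoretical peak throughput in gigaflops is just operations per second divided by one billion. The clock speed and operations-per-cycle figures in this sketch are hypothetical, chosen only to make the conversion concrete:

```c
#include <stdio.h>

int main(void) {
    /* Hypothetical example figures, not from the definition above. */
    double clock_hz        = 3.0e9;  /* 3 GHz clock                   */
    double flops_per_cycle = 4.0;    /* 4 floating point ops per cycle */

    /* Peak throughput: operations per second, scaled to billions. */
    double gigaflops = (clock_hz * flops_per_cycle) / 1.0e9;

    printf("Theoretical peak: %.1f gigaflops\n", gigaflops); /* 12.0 */
    return 0;
}
```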

The term "gigaflops" appears to be plural because it ends in "s," but the word is actually singular: FLOPS is an acronym for "floating point operations per second." This is why gigaflops is sometimes written as "gigaFLOPS." Since gigaflops measures how many billions of floating point calculations a processor can perform each second, it serves as a good indicator of a processor's raw performance. However, because it does not measure integer calculations, gigaflops is not a comprehensive measure of a processor's overall performance.
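A minimal sketch of how such a measurement might be taken in practice: time a tight loop of floating point operations and divide the operation count by the elapsed seconds. A naive single-threaded loop like this one will land far below a processor's theoretical peak, so treat the result as illustrative only:

```c
#include <stdio.h>
#include <time.h>

int main(void) {
    const long iterations = 100000000L;   /* 100 million loop passes    */
    double x = 1.000000001, sum = 0.0;

    clock_t start = clock();
    for (long i = 0; i < iterations; i++) {
        sum += x * x;                     /* 2 floating point ops/pass  */
    }
    clock_t end = clock();

    double seconds = (double)(end - start) / CLOCKS_PER_SEC;
    double flops   = 2.0 * (double)iterations;   /* total operations    */

    /* Printing sum keeps the compiler from optimizing the loop away. */
    printf("Estimated: %.2f gigaflops (sum=%g)\n",
           flops / seconds / 1.0e9, sum);
    return 0;
}
```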

Updated: January 22, 2009

Cite this definition:

http://techterms.com/definition/gigaflops

TechTerms - The Tech Terms Computer Dictionary

This page contains a technical definition of Gigaflops. It explains in computing terminology what Gigaflops means and is one of many technical terms in the TechTerms dictionary.

All definitions on the TechTerms website are written to be technically accurate but also easy to understand. If you find this Gigaflops definition to be helpful, you can reference it using the citation links above. If you think a term should be updated or added to the TechTerms dictionary, please email TechTerms!