Gigaflops is a unit of measurement used to gauge the performance of a computer's floating point unit, commonly referred to as the FPU. One gigaflops is one billion (1,000,000,000) floating point operations per second (FLOPS).
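A processor's theoretical peak gigaflops can be estimated by multiplying its core count, clock speed, and the number of floating point operations each core can complete per cycle. The numbers below are illustrative assumptions for a hypothetical CPU, not specifications for any real chip:

```python
# Theoretical peak gigaflops (assumed, hypothetical CPU):
#   peak FLOPS = cores x clock (Hz) x floating point ops per cycle
cores = 4                 # assumed core count
clock_hz = 2.5e9          # assumed 2.5 GHz clock
flops_per_cycle = 8       # e.g. SIMD unit doing 4 doubles x 2 ops (fused multiply-add)

peak_flops = cores * clock_hz * flops_per_cycle
peak_gigaflops = peak_flops / 1e9   # one gigaflops = one billion FLOPS

print(f"Theoretical peak: {peak_gigaflops:.1f} gigaflops")  # 80.0 gigaflops
```

Real-world performance is almost always lower than this peak, since memory access and instruction scheduling keep the FPU from being fully busy every cycle.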

The term "gigaflops" appears to be plural, since it ends in "s," but the word is actually singular, since FLOPS is an acronym for "floating point operations per second." This is why gigaflops is sometimes written as "gigaFLOPS." Since gigaflops measures how many billions of floating point calculations a processor can perform each second, it is a good indicator of a processor's raw floating point performance. However, because it does not measure integer calculations, gigaflops cannot serve as a comprehensive measure of a processor's overall performance.
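As a rough illustration of how a FLOPS rate is measured, the sketch below times a loop of floating point multiply-add operations and reports the achieved rate in gigaflops. Because it runs in the Python interpreter, the result reflects interpreter overhead rather than the FPU's true capability, so it will fall far below hardware peak; the `estimate_gigaflops` helper is hypothetical, written for this example:

```python
import time

def estimate_gigaflops(n=1_000_000):
    """Time n multiply-add iterations and return the achieved gigaflops."""
    x = 1.000001
    acc = 0.0
    start = time.perf_counter()
    for _ in range(n):
        acc = acc * x + 1.0  # one multiply + one add = 2 floating point ops
    elapsed = time.perf_counter() - start
    flops = 2 * n / elapsed  # total floating point ops per second
    return flops / 1e9       # convert FLOPS to gigaflops

print(f"Achieved: ~{estimate_gigaflops():.3f} gigaflops (interpreter overhead included)")
```

Benchmarks such as LINPACK use the same basic idea (counting floating point operations over a timed workload), but run optimized native code so the measurement reflects the hardware itself.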

Updated January 22, 2009 by Per C.


The Tech Terms Computer Dictionary

The definition of Gigaflops on this page is an original definition written by the TechTerms team.

