Gigaflops

Gigaflops is a unit of measurement for the performance of a computer's floating point unit, commonly referred to as the FPU. One gigaflops equals one billion (1,000,000,000) floating point operations per second.

The term "gigaflops" appears to be plural, since it ends in "s," but the word is actually singular since FLOPS is an acronym for "floating point operations per second." This is why gigaflops is sometimes written as "gigaFLOPS." Since gigaflops measures how many billions of floating point calculations a processor can perform each second, it serves as a good indicator of a processor's raw performance. However, since it does not measure integer calculations, gigaflops cannot be used as a comprehensive means of measuring a processor's overall performance.

Updated January 22, 2009 by Per C.


