MIPS

Stands for "Million Instructions Per Second." It is a method of measuring the raw speed of a computer's processor. Since the MIPS measurement doesn't take into account other factors, such as the computer's I/O speed or processor architecture, it isn't always a fair way to measure the performance of a computer. For example, a computer rated at 100 MIPS may be able to compute certain functions faster than another computer rated at 120 MIPS.
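As a sketch of how the metric works, a MIPS rating is simply an instruction count divided by elapsed time, scaled to millions (the function name and sample numbers below are illustrative, not from any real benchmark):

```python
def mips(instructions, seconds):
    """Convert a raw instruction count and elapsed time to MIPS."""
    return instructions / seconds / 1_000_000

# A hypothetical processor executing 500 million instructions in 2 seconds:
print(mips(500_000_000, 2))  # 250.0 MIPS
```

Note that this says nothing about how much useful work each instruction performs, which is exactly why comparing MIPS across different processor architectures can be misleading.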

The MIPS measurement has been used by computer manufacturers like IBM to measure the "cost of computing," expressing the value of a computer in MIPS per dollar. Interestingly, the MIPS-per-dollar value of computers has roughly doubled each year for the last couple of decades.
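Annual doubling is exponential growth, so the cumulative effect over a couple of decades is dramatic. A small illustration, assuming a starting value of 1 MIPS per dollar (the starting figure is hypothetical, chosen only to show the growth factor):

```python
def mips_per_dollar(start, years, annual_factor=2):
    """Project MIPS-per-dollar value, assuming it multiplies by
    annual_factor each year (doubling, per the trend described above)."""
    return start * annual_factor ** years

# After 20 years of annual doubling, the value grows by a factor of 2^20:
print(mips_per_dollar(1, 20))  # 1048576
```

In other words, twenty years of doubling yields over a million times more processing power per dollar.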

Updated in 2006 by Per C.


The Tech Terms Computer Dictionary

The definition of MIPS on this page is an original definition written by the TechTerms.com team.

