
MIPS

Stands for "Million Instructions Per Second." It is a method of measuring the raw speed of a computer's processor. Since the MIPS measurement doesn't take into account other factors such as the computer's I/O speed or processor architecture, it isn't always a fair way to measure the performance of a computer. For example, a computer rated at 100 MIPS may be able to compute certain functions faster than another computer rated at 120 MIPS.
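The measurement itself is straightforward: divide the number of instructions executed by the execution time, then scale to millions. A minimal sketch, using hypothetical numbers for illustration:

```python
def mips(instruction_count: int, seconds: float) -> float:
    """Millions of instructions executed per second."""
    return instruction_count / seconds / 1_000_000

# Hypothetical example: 5 billion instructions in 50 seconds
print(mips(5_000_000_000, 50))  # 100.0 MIPS
```

Note that two processors with the same MIPS rating can still differ in real-world speed, since a single "instruction" may accomplish more work on one architecture than on another.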

The MIPS measurement has also been used by computer manufacturers like IBM to measure the "cost of computing," expressing the value of a computer in MIPS per dollar. Interestingly, MIPS per dollar has roughly doubled each year for the past couple of decades.
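The effect of that annual doubling compounds quickly. A brief sketch with an assumed starting value (the initial figure is hypothetical; only the doubling rate comes from the text above):

```python
def mips_per_dollar(initial: float, years: int) -> float:
    """Projected MIPS per dollar after a given number of years,
    assuming the value doubles annually."""
    return initial * 2 ** years

# Starting from a hypothetical 1 MIPS per dollar, two decades of
# annual doubling yields over a million MIPS per dollar.
print(mips_per_dollar(1.0, 20))  # 1048576.0
```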

Cite this definition:

https://techterms.com/definition/mips
