
Nanometer

A nanometer (also spelled "nanometre") is a unit of measurement used for length. One nanometer is one billionth of a meter, so nanometers are not used to measure long distances. Instead, they serve to measure extremely small objects, such as atomic structures or the transistors found in modern CPUs.

A single nanometer is one millionth of a millimeter. One thousandth of a millimeter is a micrometer, also called a micron. Divide that micron by 1,000 and you have a nanometer. Needless to say, a nanometer is extremely small.
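
The same relationships can be written out as a quick calculation. The minimal Python sketch below (with illustrative constant and function names, not taken from any library) expresses each unit as a fraction of a meter and converts between them:

# Illustrative unit constants, expressing each unit as a fraction of a meter.
METERS_PER_MILLIMETER = 1e-3   # one thousandth of a meter
METERS_PER_MICROMETER = 1e-6   # one thousandth of a millimeter
METERS_PER_NANOMETER = 1e-9    # one thousandth of a micrometer

def nanometers_to_millimeters(nm):
    """Convert a length in nanometers to millimeters."""
    return nm * METERS_PER_NANOMETER / METERS_PER_MILLIMETER

# One nanometer is one millionth of a millimeter:
print(nanometers_to_millimeters(1))  # 1e-06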

Since integrated circuits, such as computer processors, contain microscopic components, nanometers are useful for measuring their size. In fact, successive generations of processors are defined by their manufacturing process, measured in nanometers, where the number describes the size of and spacing between transistors and other components within the CPU. The smaller the number, the more transistors can be placed within the same area, allowing for faster, more efficient processor designs.
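
As a rough, idealized illustration (actual process names do not map exactly to physical dimensions), transistor density scales roughly with the square of the feature size, so halving the process number allows about four times as many transistors in the same area. The short Python sketch below makes that simplified calculation explicit:

# Idealized density comparison, assuming transistor area scales with the
# square of the process feature size (a simplification for illustration).
def relative_density(old_process_nm, new_process_nm):
    """Approximate factor by which transistor density increases."""
    return (old_process_nm / new_process_nm) ** 2

print(relative_density(90, 45))   # 4.0  (halving the number ~ 4x the density)
print(relative_density(32, 22))   # ~2.1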

Intel's processor line, for example, has included chips based on 90-nanometer, 65-nanometer, 45-nanometer, and 32-nanometer processes. Intel introduced its 22-nanometer process in 2011, and the first chips built on it shipped in 2012. Intel's "Broadwell" design, introduced in 2014, uses a 14-nanometer process.

Abbreviation: nm

Updated: August 14, 2014

Cite this definition:

http://techterms.com/definition/nanometer
