
Integrated Circuit

An integrated circuit, or IC, is a small chip that can function as an amplifier, oscillator, timer, microprocessor, or even computer memory. An IC is a small wafer, usually made of silicon, that can hold anywhere from hundreds to millions of transistors, resistors, and capacitors. These extremely small electronics can perform calculations and store data using either digital or analog technology.

Digital ICs use logic gates, which work only with values of ones and zeros. A low signal sent to a component on a digital IC will result in a value of 0, while a high signal creates a value of 1. Digital ICs are the kind you will usually find in computers, networking equipment, and most consumer electronics.
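
To make that behavior concrete, here is a minimal Python sketch of how a digital component treats signals as ones and zeros. The 2.5 V threshold and the gate functions are illustrative assumptions, not a description of any particular chip.

    # Minimal sketch of digital logic behavior, assuming a hypothetical
    # 2.5 V threshold separating "low" (0) from "high" (1) signals.

    def to_logic_level(voltage: float, threshold: float = 2.5) -> int:
        """Map an input voltage to a digital value: 0 for low, 1 for high."""
        return 1 if voltage >= threshold else 0

    def and_gate(a: int, b: int) -> int:
        """Output 1 only when both inputs are 1, as an AND logic gate does."""
        return a & b

    def or_gate(a: int, b: int) -> int:
        """Output 1 when either input is 1, as an OR logic gate does."""
        return a | b

    # Example: a low signal (0.4 V) and a high signal (4.8 V)
    low = to_logic_level(0.4)    # -> 0
    high = to_logic_level(4.8)   # -> 1
    print(and_gate(low, high))   # -> 0
    print(or_gate(low, high))    # -> 1

No matter how the input voltage varies, the component only ever sees one of two values, which is what makes the circuit digital.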

Analog, or linear, ICs work with continuous values. This means a component on a linear IC can take a value of any kind and output another value. The term "linear" is used since the output value is a linear function of the input. For example, a component on a linear IC may multiply an incoming value by a factor of 2.5 and output the result. Linear ICs are typically used in audio and radio frequency amplification.
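
Below is a minimal Python sketch of that linear relationship. The gain of 2.5 comes from the example above, and the function name is hypothetical.

    # Minimal sketch of a linear (analog) component: the output is a
    # linear function of the input, here assuming a gain of 2.5.

    def linear_amplifier(input_value: float, gain: float = 2.5) -> float:
        """Return output = gain * input, a continuous (linear) mapping."""
        return gain * input_value

    # A continuous range of inputs maps to a continuous range of outputs.
    for v in (0.1, 0.5, 1.2):
        print(f"in={v:.1f} V -> out={linear_amplifier(v):.2f} V")

Unlike the digital case, the output here can take any value in a continuous range, not just 0 or 1.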

This page contains a technical definition of Integrated Circuit. It explains in computing terminology what Integrated Circuit means and is one of many hardware terms in the TechTerms dictionary.

All definitions on the TechTerms website are written to be technically accurate but also easy to understand. If you find this Integrated Circuit definition to be helpful, you can cite it as a reference. If you think a term should be updated or added to the TechTerms dictionary, please email TechTerms!