<electronics> (IC, or "chip") A microelectronic semiconductor device consisting of many interconnected transistors and other components.
ICs are constructed ("fabricated") on a small rectangle (a "die") cut from a silicon (or, for special applications, sapphire) wafer.
The die is known as the "substrate".
Different areas of the substrate are "doped" with other elements to make them either "p-type" or "n-type", and polysilicon or aluminium tracks are etched in one to three layers deposited over the surface.
The die is then connected into a package using gold wires which are bonded to "pads", usually found around the edge of the die.
Integrated circuits can be classified as analogue, digital, or hybrid (both analogue and digital on the same chip). Digital integrated circuits can contain anything from one to millions of logic gates (inverters, AND, OR, NAND, and NOR gates, flip-flops, multiplexers, etc.) in a few square millimetres.
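The gate types listed above can be modelled as simple boolean functions. A minimal Python sketch (purely illustrative, not part of the original entry) showing how the other basic gates can be built from NAND alone:

```python
# Illustrative sketch: the basic logic gates found in digital ICs,
# here composed entirely from the NAND gate.

def nand(a, b):
    """NAND: output is 0 only when both inputs are 1."""
    return int(not (a and b))

def inverter(a):
    """Inverter (NOT): a NAND gate with its inputs tied together."""
    return nand(a, a)

def and_gate(a, b):
    """AND: NAND followed by an inverter."""
    return inverter(nand(a, b))

def or_gate(a, b):
    """OR: NAND of the two inverted inputs (De Morgan's law)."""
    return nand(inverter(a), inverter(b))

def nor_gate(a, b):
    """NOR: OR followed by an inverter."""
    return inverter(or_gate(a, b))

# Truth table for NAND
for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", nand(a, b))
```

Flip-flops and multiplexers are in turn built from networks of such gates, which is why gate count is the usual measure of a digital IC's complexity.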
The small size of these circuits allows high speed, low power dissipation, and reduced manufacturing cost compared with board-level integration.
The first integrated circuits contained only a few transistors.
Small Scale Integration (SSI) brought circuits containing tens of transistors.
Later, Medium Scale Integration (MSI) contained hundreds of transistors.
Further development led to Large Scale Integration (LSI) (thousands of transistors), and VLSI (hundreds of thousands and beyond).
In 1986 the first one megabit RAM chips were introduced, which contained more than one million transistors.
LSI circuits began to be produced in large quantities around 1970 for computer main memories and pocket calculators.
For the first time it became possible to fabricate a CPU, or even an entire microprocessor, on a single integrated circuit.
The most extreme technique is wafer-scale integration which uses whole uncut wafers as components.
[Where and when was the term "chip" introduced?]