The History of the Integrated Circuit

The integrated circuit, also called an IC, a microchip, or simply a chip, is a set of miniature transistors and other components built on a small, flat piece of semiconductor material, usually silicon. These miniaturized transistors can switch far faster than the large discrete transistors used in previous generations of electronics. They are also far more durable and significantly cheaper to produce, which allowed them to become part of many different electronic devices.


The advent of the integrated circuit revolutionized the electronics industry and paved the way for devices such as mobile phones, computers, CD players, televisions, and many appliances found around the home. In addition, the spread of the chips helped to bring advanced electronic devices to all parts of the world.


Early History of the Integrated Circuit


The story of the IC really begins with the inherent limitations of the vacuum tube, the large, bulky device that preceded the transistor, which in turn led to the microchip. Vacuum tubes worked as electronic circuit elements, but they required time to warm up before they could operate, and they were vulnerable to damage from even minor bumps or impacts.


With these limitations in mind, German engineer Werner Jacobi filed a patent in 1949 for a semiconductor device that operated much like the modern integrated circuit. Jacobi arranged five transistors on a common substrate in a three-stage amplifier arrangement. The result, as Jacobi recognized, was the ability to shrink devices such as hearing aids and make them cheaper to produce.


Despite Jacobi’s invention, there was no immediate interest. Three years later, Geoffrey Dummer, who worked for the Royal Radar Establishment, part of Britain’s Ministry of Defence, proposed the first fully conceived idea for the integrated circuit. However, despite lecturing widely about his ideas, he was never able to build a working device. That failure helped shift the momentum behind the chip overseas to America.


Invention of the IC


Fast forward to 1957, when the US Army’s Micromodule Program proposed building small ceramic wafers that each contained a single component. The program held considerable promise. However, as the project gained traction, Jack Kilby, who had recently joined Texas Instruments, was inspired to pursue an even more advanced design that became the IC we know today.


Kilby’s prototype was primitive by today’s standards, but it worked. On September 12, 1958, Kilby demonstrated the first working IC at Texas Instruments, and he applied for a patent on February 6, 1959. His description of the device as an electronic circuit that was totally integrated led to the coining of the term integrated circuit.


Perhaps not surprisingly, the first customer for Kilby’s invention was the US Air Force. It was not long before many common electronic devices were being designed with the IC in mind. For his part in inventing the first true integrated circuit, Kilby won the Nobel Prize in Physics in 2000. Nine years later, his work was recognized as an IEEE Milestone.


Development and Production


Although Kilby’s IC was revolutionary, it was not without problems. One of the most troubling was that his chip was fashioned out of germanium. About six months after Kilby filed for his patent, Robert Noyce, who worked at Fairchild Semiconductor, recognized the limitations of germanium and created his own chip fashioned from silicon.


At the same time, Jay Last, who led a development team at Fairchild Semiconductor, worked on producing the first planar integrated circuit. Rather than a single monolithic element, it used two pairs of transistors that were electrically isolated from one another by a groove etched between them, allowing each pair to operate separately. Despite how revolutionary Last’s work was, and the success of the prototype, Fairchild’s management failed to appreciate it, and he soon left the company.


Fairchild went on to create IC chips for the Apollo spacecraft that went to the moon. This program, along with the use of chips in satellites, spread the IC from military applications to the commercial market. It also lowered the price of the IC drastically, making it practical for many electronic devices.


Noyce, who stayed at Fairchild, used an idea from Kurt Lehovec of Sprague Electric to create p-n junction isolation. This was a valuable new concept for the IC, since it allowed the transistors on the chip to work independently of one another. It opened new possibilities for the chip, and it was not long before Fairchild Semiconductor developed the self-aligned gate, which all CMOS computer chips use today.


The development of the self-aligned gate is credited to Federico Faggin, who came up with the idea in 1968 and was recognized for his work in 2010, when he received the National Medal of Technology and Innovation.


The 1960s were also marked by lawsuits between rival companies that had developed their own versions of the microchip as it was improved for many different types of electronic devices. It was the computer, however, that saw the greatest benefit. In the 1950s, computers were massive machines that could hold only a few kilobytes of memory. The integrated circuit, combined with other innovations, allowed computers to shrink considerably in size while gaining in memory.


Today, the IC remains a vital part of many different types of electronic devices. It is recognized as one of the most important inventions of the 20th century, and it elevated Jack Kilby and Robert Noyce to the status of co-inventors of the integrated circuit. While Kilby was first, Noyce added the elements that made the IC work properly and gave it the potential it has demonstrated over the decades.


