Technology
Chips - what would the modern world be without them? (Photo by Maxence Pira on Unsplash)
Present-day human civilisation cannot be imagined without tiny slivers of silicon called "chips."
They are everywhere — in smartphones, electronic gadgets, cars, medical equipment, aeroplanes, ships, household appliances, and most other objects around us.
There is no aspect of day-to-day life that is devoid of these ubiquitous marvels, derived from sand that consists mainly of silicon dioxide.
Still, it is only in the last few decades that our lives have become dependent on these grains of sand.
Intelligence results from the collection and processing of information. In machines, information is represented as strings of ones and zeros called bits. One sequence of bits could represent text, another audio, and yet another video.
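A small sketch can make this concrete: the very same string of bits takes on different meanings depending on how a program chooses to interpret it (the variable names below are illustrative, not from the article).

```python
# The same eight bits can be read as a number or as text,
# depending entirely on the chosen interpretation.
bits = "01000001"

as_number = int(bits, 2)   # interpret the bits as an unsigned integer
as_text = chr(as_number)   # interpret that integer as an ASCII character

print(as_number)  # 65
print(as_text)    # A
```

The hardware stores only the ones and zeros; it is the surrounding software that decides whether a given string of bits is a character, a pixel, or a sample of sound.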
A "bit" of information is loosely analogous to a neuron, the fundamental constituent of the brain in living organisms.
After the end of the Second World War, at the dawn of the computer age, computers used bulky vacuum tubes that glowed like electric bulbs, each representing one bit of information.
Some of us might remember those old, clunky radios with vacuum tubes. But there was a limit to the number of such tubes that could be strung together with miles of wiring.
The tubes burned out like electric bulbs, and finding and replacing the defective ones in a network spanning several rooms was a full-time job in itself.
A miniature replacement device called the "transistor" was invented at Bell Laboratories in the United States (US). It was made from a tiny piece of germanium. It was known as a "solid state" device because it used solid semiconductor material instead of evacuated glass tubes.
Miniaturisation allowed hundreds of germanium transistors to be strung onto a board, and several such boards together provided substantial computing power. Still, each device had to be laboriously connected to the others with wires, a process prone to errors.
What led to the widespread use of transistors was that Bell Labs made the technology available to industry for a nominal fee and helped companies manufacture the devices.
Around that time, the US helped Japan recover from the Second World War by transferring the technology. Sony was the first company to capitalise on it, marketing portable transistor radios to replace the bulky vacuum-tube-based ones. Japan's consumer electronics industry grew in this manner.
William Shockley, a co-inventor of the transistor at Bell Labs, started his own company. He would build his transistors from silicon rather than from germanium.
Silicon has a higher melting point and is harder to work with. But it is the second-most abundant element in the Earth's crust, after oxygen, and devices made from silicon work in almost any environment — hot, cold, wet, or dry.
The next step in the journey was to etch several transistors on a small chip, with the connections between them also etched on the chip. This made the assemblies more reliable and took up much less space.
Two pioneers made this happen: Jack Kilby at Texas Instruments and Robert Noyce at Fairchild Semiconductor. These devices were named "integrated circuits (ICs)."
From this point on, the development of chips galloped — spurred by the space race and the Cold War between the superpowers.
Chips became denser, with more and more transistors packed together — first hundreds, then thousands, then millions. They became reliable enough to be used in harsh military conditions and in the extremes of space weather.
Fairchild got a contract to build the Apollo guidance system that carried the astronauts to land on the Moon. Texas Instruments won the contract to supply chips for the Minuteman II missiles. The laser-guided bombs, helped by sophisticated chips, increased the precision and, hence, the amount of destruction caused.
Prices of the chips dropped, as millions of them were manufactured to feed the ever-increasing demand. The manufacturing processes matured as a result.
Intel started with the production of chips that store information, called "memories."
When it received orders from Japan for chips to power handheld calculators, Intel decided to design and produce a general-purpose computing chip called a "microprocessor." Microprocessors are at the heart of every personal computer and smartphone.
Intel lost the battle for memories, a market now dominated by South Korea's Samsung and SK Hynix and the US' Micron Technology. However, it remains a major force in the design and manufacture of microprocessors.
A third class of chips, apart from memories and microprocessors, is miscellaneous chips: sensors that measure pressure, temperature, or acceleration; GPS chips; and the radio chips in smartphones that communicate with cell-phone towers.
The notable trend of the last three decades has been the rise of chip factories, called "fabs", that specialise solely in manufacturing chips to orders placed by chip designers.
Taiwan Semiconductor Manufacturing Company (TSMC) is the world's largest such fab — taking orders from, among others, Apple, which makes smartphones, tablets, and computers, and Nvidia, which makes graphics chips used for computer gaming and artificial intelligence (AI).
TSMC makes the most sophisticated chips at the cutting edge of technology. It was set up by Taiwan to enter the high-tech manufacturing sector.
Minister K T Li tapped Morris Chang, who had left Texas Instruments after being passed over for the top job there. Chang is the founding father who built Taiwan's reputation as the world's chip leader.
Taiwan is to chips what India is to information technology (IT) services. Taiwan is also in the crosshairs of China, which claims Taiwan to be an integral part of China.
The manufacture of chips depends on cutting-edge machines called "lithography machines", which cost hundreds of millions of dollars. The latest of them, based on extreme ultraviolet (EUV) lithography, are made by ASML, a Dutch company.
The company was formed from a group at Philips and built on research conducted in labs across the US and Europe. Hundreds of parts for each machine are made in other countries, such as Germany.
Chips are designed using computer-aided design software, supplied mainly by three American companies — Cadence, Synopsys, and Mentor.
Chip-making is truly an international supply chain, and TSMC is a choke point in it, given the geopolitical situation with China. Efforts are naturally underway to create fabs in other parts of the world.
As Thomas Friedman wrote in the New York Times recently, TSMC is built on the trust of its supply chains and the culture of its engineers. Establishing that trust and culture is difficult, even in places like the US.
5G smartphones depend heavily on cutting-edge chips. The US company Qualcomm designs chips used in smartphones, while Huawei is a leader in the equipment that enables 5G functionality, notwithstanding the controversy over its Chinese origins.
6G is on the horizon, and chips will only get denser and more complex, delivering better performance for consumers.
Chips have made remarkable progress within a single human generation. I programmed Intel's 8085 microprocessor in the 1980s, but without advanced training I would struggle to understand and program the current generation of microprocessors.
It is fair to say that without chips, the Moon landing would not have been possible, nor would the contemporary digital and smartphone revolutions.
The grains of sand have left an indelible impact on the world.
Writer's Note: Most of the material for this article has been drawn from observing the industry for several years and also by reading the books The Man Behind The Microchip by Leslie Berlin, The Innovators by Walter Isaacson, Chip War by Chris Miller, and Inside Intel by Tim Jackson.