The Evolution of Semiconductors: From History to Modern Innovations
- Phelan Aromana
- Dec 24, 2024
- 5 min read
Updated: Dec 26, 2024
A Brief History of Semiconductors

The journey of semiconductors began in the 18th and 19th centuries, when scientists first explored the principles of electricity and conductivity. My personal favorite is the kite experiment Benjamin Franklin conducted in 1752, because of how simple and ‘no shit’ the experiment sounds. To be fair, he made a groundbreaking discovery at the time: his work paved the way for the modern understanding of electricity and led to his invention of the lightning rod, which protects buildings by safely directing electrical discharges into the ground. Not to mention the experiment was very dangerous to conduct, and at least one later attempt to replicate it proved fatal.
The first truly pivotal discoveries for semiconductors, however, came from Michael Faraday:
In 1831, Faraday discovered that a changing magnetic field could induce an electric current in a conductor, demonstrating it by moving a magnet through a coil of wire and generating a current. This principle of electromagnetic induction forms the basis of electric generators and transformers, and it laid the foundation for modern electric power generation and distribution.
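To put a rough number on that principle: Faraday's law of induction says the voltage induced in a coil equals the number of turns times the rate of change of magnetic flux. Here is a minimal sketch with values I picked purely for illustration (a 100-turn coil and a convenient flux change, nothing from Faraday's own notebooks):

```python
# Faraday's law of induction: EMF = -N * dPhi/dt
# The numbers below are illustrative assumptions, not historical data.

turns = 100            # N: number of turns in the coil
flux_change = 0.01     # change in magnetic flux through one turn, in webers
time_interval = 0.1    # time over which that change happens, in seconds

emf = -turns * flux_change / time_interval
print(f"Induced EMF: {emf:.1f} V")  # -> Induced EMF: -10.0 V
```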
In 1833, Faraday observed the semiconductor behavior of silver sulfide, noting that its resistance decreased as temperature rose, the opposite of how ordinary metals behave. His laws of electrolysis, meanwhile, laid the foundation for understanding how electric currents interact with chemical substances. While those laws are primarily concerned with electrochemical reactions, their principles are deeply integrated into the processes and applications of semiconductors, and Faraday's insights continue to influence modern technology, from precise material deposition during chip fabrication to the operation of electrochemical semiconductor devices.
Faraday’s Laws of Electrolysis

First Law: The amount of chemical substance deposited or liberated at an electrode during electrolysis is directly proportional to the quantity of electric charge passed through the electrolyte.
Second Law: The masses of different substances liberated or deposited by the same amount of electric charge are proportional to their chemical equivalents (molar masses divided by valency).
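To make the First Law concrete, here is a small back-of-the-envelope calculation in Python. The copper-plating numbers are my own illustrative choice; the same relationship governs the copper electroplating used to build chip interconnects today:

```python
# Faraday's first law of electrolysis: m = (Q * M) / (F * z)
# Example numbers (copper, 1 A for one hour) are assumptions for illustration.

FARADAY_CONSTANT = 96485.0  # coulombs per mole of electrons

def deposited_mass(charge_coulombs: float, molar_mass: float, valency: int) -> float:
    """Mass in grams deposited by a given charge, per Faraday's first law."""
    return charge_coulombs * molar_mass / (FARADAY_CONSTANT * valency)

charge = 1.0 * 3600  # 1 ampere flowing for one hour = 3600 coulombs
mass = deposited_mass(charge, molar_mass=63.5, valency=2)  # Cu2+: M = 63.5 g/mol, z = 2
print(f"Copper deposited: {mass:.2f} g")  # -> roughly 1.18 g
```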
The term "semiconductor" was first coined in the mid-19th
century, but its practical applications didn't take off until much later. In 1874, German physicist Ferdinand Braun discovered the rectifying properties of metal-sulfur junctions, laying the groundwork for the diode, a critical component in electronic circuits used for signal rectification and control. Braun observed the rectifying behavior of metal-semiconductor junctions, where current flows more easily in one direction than the other.
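Braun's observation is exactly the asymmetry that the Shockley diode equation, worked out decades later, describes: the same voltage passes an enormous current in the forward direction and almost nothing in reverse. A quick sketch with typical, assumed silicon-diode values:

```python
import math

# Shockley diode equation: I = I_s * (exp(V / (n * V_T)) - 1)
# Parameter values are assumed, typical-order-of-magnitude numbers.

SATURATION_CURRENT = 1e-12  # I_s in amperes
THERMAL_VOLTAGE = 0.02585   # V_T in volts, at roughly room temperature
IDEALITY_FACTOR = 1.0       # n

def diode_current(voltage: float) -> float:
    """Current through an ideal diode at a given applied voltage."""
    return SATURATION_CURRENT * (math.exp(voltage / (IDEALITY_FACTOR * THERMAL_VOLTAGE)) - 1)

print(f"+0.5 V forward: {diode_current(+0.5):.2e} A")  # around 2.5e-04 A
print(f"-0.5 V reverse: {diode_current(-0.5):.2e} A")  # about -1e-12 A, essentially blocked
```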

Fun fact: In 1909, Braun shared the Nobel Prize in Physics with Guglielmo Marconi for their contributions to the development of wireless telegraphy.
When I was younger, in Rome, the train station close to the store where we used to get our electronics fixed and modified was called "Marconi".
By the 20th century, advancements like the invention of the vacuum tube in 1904 provided a glimpse of the potential for electronic control, but vacuum tubes were bulky, inefficient, and prone to failure. The game-changing moment came in 1947, when John Bardeen, Walter Brattain, and William Shockley at Bell Labs invented the transistor, a compact, reliable, and efficient alternative to the vacuum tube. The transistor revolutionized electronics, making semiconductors a cornerstone of modern technology, and the invention earned the three the Nobel Prize in Physics in 1956.
The subsequent decades witnessed rapid development. In 1958, Jack Kilby of Texas Instruments demonstrated the first integrated circuit (IC), and Robert Noyce of Fairchild Semiconductor independently developed a practical monolithic version the following year, enabling the creation of compact, high-performance electronic devices. This innovation marked the beginning of the microelectronics era, characterized by the miniaturization of components and exponential growth in computing power, as famously described by Moore's Law.
Moore's Law, as Gordon Moore originally stated it in 1965, observed that the number of transistors on a microchip doubled roughly every year; in 1975 he revised the pace to about every two years.
In practice this means that as transistor density increases, chips become more powerful, more efficient, and cheaper per unit of computing power. The often-quoted 18-month figure is usually attributed to Intel executive David House, who factored in performance gains as well as transistor counts.
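As a sanity check on what a two-year doubling period compounds to, here's a tiny projection starting from the Intel 4004's roughly 2,300 transistors in 1971. The projection is just arithmetic under the two-year assumption, not a claim about any particular chip:

```python
# Compounding implied by a two-year transistor-count doubling period.
# Baseline: Intel 4004 (1971), roughly 2,300 transistors.

def projected_transistors(start_count: int, start_year: int, target_year: int,
                          doubling_years: float = 2.0) -> float:
    doublings = (target_year - start_year) / doubling_years
    return start_count * 2 ** doublings

for year in (1971, 1991, 2011, 2021):
    print(f"{year}: ~{projected_transistors(2_300, 1971, year):,.0f} transistors")
# 2021 works out to roughly 77 billion, the same order of magnitude as
# the largest chips actually shipping around then.
```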
The Role of Semiconductors in Modern Technology
Today, semiconductors are the backbone of virtually every electronic device. From smartphones and laptops to medical devices and satellites, semiconductors power our digital world. Their versatility lies in the fact that their conductivity sits between that of conductors and insulators and can be precisely controlled through doping and applied voltages, enabling the complex switching and amplification that circuits depend on.

The semiconductor industry has evolved into a global market worth well over half a trillion dollars a year. Leading companies like Intel, Samsung, TSMC, and NVIDIA dominate the space, producing advanced chips used in applications ranging from consumer electronics to artificial intelligence (AI) and autonomous vehicles.
Current Trends and Innovations
AI and Machine Learning
Specialized chips, such as GPUs and TPUs (Tensor Processing Units), are optimized for AI workloads. These chips are driving advancements in areas like natural language processing, computer vision, and robotics.
5G and Beyond
Semiconductors are pivotal in the rollout of 5G technology, enabling faster and more reliable communication networks. Companies are already investing in research for 6G, which promises even greater speeds and connectivity.
Quantum Computing
Research into quantum semiconductors aims to unlock unprecedented computing power. Quantum chips, leveraging principles of quantum mechanics, could revolutionize fields like cryptography and drug discovery.
Sustainability
As demand for semiconductors grows, so does the need for sustainable manufacturing practices. Efforts are underway to minimize the environmental impact of chip production by reducing water and energy consumption and adopting recyclable materials.
The Future of Semiconductors
The semiconductor industry's future is both exciting and challenging. As devices become more interconnected through the Internet of Things (IoT), the demand for efficient, low-power chips will surge. Simultaneously, the industry must navigate technical hurdles like the physical limits of transistor miniaturization and geopolitical complexities. In the coming decades, we can expect breakthroughs in materials science, such as the adoption of graphene and other two-dimensional materials, which promise superior performance compared to traditional silicon. Innovations in chip architecture, such as 3D stacking and neuromorphic computing, will also play a critical role in shaping the future.
One relationship worth singling out is the partnership between the United States and Taiwan in the semiconductor industry.

Taiwan produces over 60% of the world's semiconductors and more than 90% of the most advanced chips; this dominance positions Taiwan as an indispensable player in the global tech ecosystem. The island is home to TSMC, the world's largest and most advanced semiconductor foundry, which produces cutting-edge chips for leading companies like Apple, NVIDIA, AMD, and Qualcomm. TSMC specializes in high-performance nodes such as 3nm and 5nm, which are critical for modern applications like AI, 5G, and autonomous vehicles. China views Taiwan as a key piece in its ambition to achieve technological self-sufficiency, and any disruption in Taiwan, whether due to political instability or conflict, could severely impact global chip supplies. The U.S.-Taiwan semiconductor partnership is therefore crucial for maintaining technological leadership against competitors like China and the European Union.
Stop being lazy and do your own research when you need to
Song of the day: Better things - Zakhar
Writers: Age







