Semiconductor makers are experiencing renewed growth after pandemic-era oversupply, with AI and data center demand driving recovery across memory, packaging, and specialized components.
Amkor Technology, the world's largest U.S.-headquartered outsourced semiconductor assembly and test (OSAT) provider, exemplifies the packaging boom. Advanced packaging technologies are critical for AI chips that require high-bandwidth memory integration and thermal management.
Analog Devices cited strong demand from industrial and data center customers as AI continues driving semiconductor sales. The data center segment particularly requires specialized ASICs, FPGAs, and analog components for power management and signal processing.
Memory markets show cyclical strength. DRAM demand from AI training systems and inference workloads is tightening supply after years of cautious capacity additions. Semiconductor manufacturers learned from pandemic-era overbuilding and now expand production carefully.
Cisco announced its Silicon One G300 networking chip, with Yousuf Khan emphasizing that "AI at scale demands open, standards-based networking that customers can deploy with confidence across diverse environments." AI infrastructure extends beyond compute chips to networking and interconnect technologies.
ams OSRAM AG reported more than EUR 500 million in design wins for its Digital Light technology, underscoring growth in photonics and optical components, which enable the high-speed chip-to-chip communication essential for distributed AI training. The company anticipates modest FY26 revenue softening due to divestments and currency headwinds but maintains its technology trajectory.
The semiconductor recovery is selective. AI-focused segments, including high-bandwidth memory, advanced packaging, networking ASICs, and photonics, are growing robustly, while traditional segments remain subject to cyclical pressures.
Industry players now navigate capacity planning with caution. The pandemic boom created oversupply that depressed prices through 2023. Current AI demand is substantial but concentrated in specific technology nodes and package types. Companies invest in 2.5D and 3D packaging capabilities, CoWoS (chip-on-wafer-on-substrate) technologies, and silicon photonics rather than broad capacity expansion.
The shift toward AI-optimized hardware accelerates. Training large language models requires thousands of GPUs with high-bandwidth memory and low-latency interconnects. Inference at scale demands energy-efficient ASICs. These requirements drive semiconductor innovation beyond traditional Moore's Law scaling to heterogeneous integration and specialized architectures.