Description: AI data center expansion drives global shortages of power ICs, DRAM, and MLCCs. Lead times exceed 20 weeks, reshaping component supply chains.
The second half of 2025 has begun with an unmistakable signal — global demand for AI hardware is not slowing down. According to the latest SemiMarket Research data, procurement of AI inference accelerators, GPUs, and high-bandwidth memory (HBM) modules has surged to record levels, pushing lead times for key components such as power management ICs and high-speed interconnect chips from the traditional 8–10 weeks to well over 20 weeks.
Caption: High-performance PCBs are at the heart of AI data center hardware, driving unprecedented demand for critical components.
China’s domestic data center boom is a major driver. Leading cloud service providers have moved aggressively to lock in 2026 deliveries, pre-booking MCU, FPGA, and DDR5 DRAM inventories months ahead of schedule. This accelerated purchasing is creating ripple effects — particularly in passive components. High-voltage MLCCs are now experiencing quarterly price hikes of 10–15%, with manufacturers operating near peak capacity.
International distributors are feeling the impact from both directions. On one hand, rising unit prices can bolster revenue; on the other, extended lead times disrupt just-in-time strategies and force companies to rethink stock management. A hybrid stocking approach, which keeps high-turnover SKUs readily available while locking in AI-specialty parts through long-term contracts, is emerging as a best practice.
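To make the idea concrete, the short Python sketch below shows one way such a hybrid policy could be expressed as a simple decision rule. It is purely illustrative: the 20-week lead-time cutoff, the turnover threshold, the field names, and the PMIC-EXAMPLE part number are assumptions for the sketch, not details from the report.

    # Illustrative sketch only: a simple decision rule for a hybrid stocking
    # policy. Thresholds, field names, and the example part number are
    # hypothetical, not taken from any distributor's actual system.
    from dataclasses import dataclass

    @dataclass
    class Sku:
        part_number: str
        lead_time_weeks: int   # quoted supplier lead time
        monthly_turns: float   # how often on-hand stock sells through per month

    def stocking_strategy(sku: Sku) -> str:
        """Classify a SKU under a hybrid stocking policy."""
        # Long-lead AI-specialty parts (power ICs, HBM-class memory, high-speed
        # interconnect) are better covered by long-term supply contracts.
        if sku.lead_time_weeks > 20:
            return "long-term contract"
        # Fast-moving commodity parts stay on the shelf for just-in-time sales.
        if sku.monthly_turns >= 1.0:
            return "hold in stock"
        # Slow movers with normal lead times are bought against confirmed orders.
        return "order on demand"

    print(stocking_strategy(Sku("PMIC-EXAMPLE", lead_time_weeks=26, monthly_turns=0.2)))
    # -> long-term contract

In practice, a distributor would feed a rule like this with live supplier lead-time quotes and its own sales history rather than fixed thresholds, but the split between shelf stock and contracted supply is the core of the approach.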
“The AI wave is no longer confined to GPUs. Every layer of the stack — from power delivery to signal integrity — is feeling the pull,” said an industry analyst in Shenzhen.
Industry experts warn that the situation could tighten further in Q4 as hyperscalers accelerate deployments of next-gen AI inference systems. Distributors who invest in predictive inventory models and supplier diversification stand to benefit most from this ongoing market shift.