2025-04-28
AI Chips for Data Centers and Cloud 2025-2035: Technologies, Market, Forecasts
Converging & Hi-Tech / AI
IDTechEx

License Type

Electronic (1-5 users): US$ 7,000
Electronic and 1 Hardcopy (1-5 users): US$ 7,975
Electronic (6-10 users): US$ 10,000
Electronic and 1 Hardcopy (6-10 users): US$ 10,975

< Key Highlights >

Frontier AI attracts hundreds of billions in global investment, with governments and hyperscalers racing to lead in domains like drug discovery and autonomous infrastructure. Graphics processing units (GPUs) and other AI chips have been instrumental in driving this growth of artificial intelligence, providing the compute needed for deep learning within data centers and cloud infrastructure. GPUs have been pivotal in delivering computational capabilities, being the dominant undercurrent below the wave that is large language models (LLMs) and generative AI. However, with the demand for more efficient computation, lower costs, higher performance, massively scalable systems, faster inference, and domain-specific computation, there is opportunity for other AI chips to grow in popularity.
 
As the landscape of AI chips broadens past just GPUs, with novel architectures reaching widescale commercialization, IDTechEx's report "AI Chips for Data Centers and Cloud 2025-2035: Technologies, Market, Forecasts" offers an independent analysis of the AI chip market for data centers and the cloud. This includes benchmarking current and emerging technologies, technology breakdowns, and key trends, covering current and emerging hardware architectures, advanced node technologies, and advanced semiconductor packaging, as well as information on supply chain, investments, and policy. Granular revenue forecasts from 2025 to 2035 of the data center and cloud AI chips market are provided, segmented by types of AI chips. These include GPUs, custom AI application-specific integrated circuits (ASICs) used by hyperscalers and cloud service providers (CSPs), AI-capable central processing units (CPUs), and other AI ASICs developed by both AI chip-focused startups and large vendors.

Graphics Processing Units (GPUs)
The largest AI systems are massive scale-out HPC and AI deployments, which rely heavily on GPUs. These tend to be hyperscaler AI data centers and supercomputers, both of which can offer exaFLOPS of performance, on-premise or over distributed networks. NVIDIA has seen remarkable success in recent years with its Hopper (H100/H200) chips and recently released Blackwell (B200/B300) chips. AMD has also produced competitive chips with its MI300 series processors (MI300X/MI325X). Chinese players are developing their own solutions in response to US export controls on advanced chips. These high-performance GPUs continue to adopt the most advanced semiconductor technologies. One example is increased on-chip memory capacity: top chips carry over 250GB of high-bandwidth memory (HBM), enabling larger AI models with even more parameters to run on these GPUs. These chips also adopt the most advanced semiconductor packaging solutions, such as TSMC's CoWoS-L packaging, chiplets, and multi-die GPUs, as well as the most advanced process nodes (5nm and below). All of these trends and market activities are explored in detail in this report.
 
Custom AI Chips Used by Hyperscalers and Cloud Service Providers
GPUs have been fundamental for training AI models but face limitations: high total cost of ownership (TCO), vendor lock-in risk, low utilization for AI-specific operations, and they can be overkill for specific inference workloads. An emerging strategy among hyperscalers is to use systolic-array-based custom AI ASICs. These have purpose-built cores for AI workloads, are cheaper per operation, are specialized for particular workloads (e.g., transformers, recommender systems), offer efficient inference, and give hyperscalers and CSPs the opportunity for full-stack control and differentiation without sacrificing performance. Evaluation of potential risks, key partnerships, player activity, benchmarking, and technology overviews is available in this report.
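The systolic-array idea mentioned above can be sketched in a few lines: operands stream through a grid of multiply-accumulate units, and each processing element (PE) accumulates one output value. This is a minimal, illustrative NumPy simulation of an output-stationary array, not any vendor's actual design; the function name is my own.

```python
import numpy as np

def systolic_matmul(A, B):
    """Toy output-stationary systolic array: PE (i, j) accumulates output
    element (i, j) as operands stream through over K time steps."""
    M, K = A.shape
    K2, N = B.shape
    assert K == K2
    acc = np.zeros((M, N))                 # one accumulator per PE
    for t in range(K):                     # step t: A[:, t] and B[t, :] flow in
        acc += np.outer(A[:, t], B[t, :])  # every PE performs one MAC
    return acc

A = np.random.rand(4, 8)
B = np.random.rand(8, 3)
assert np.allclose(systolic_matmul(A, B), A @ B)
```

The appeal for ASICs is that this dataflow needs no instruction fetch or cache hierarchy per operation: operands move directly between neighboring PEs, which is what drives the cost-per-operation advantage described above.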
 
Other AI Chips
Other AI chips are being commercialized to disrupt GPUs, with both similar and novel computing architectures. Some large chip vendors, such as Intel, Huawei, and Qualcomm, have designed AI accelerators (e.g., Gaudi, Ascend 910, Cloud AI 100) using heterogeneous arrays of compute units (similar to GPUs) but purpose-built to accelerate AI workloads. These offer a balance between performance, power efficiency, and flexibility for specific application domains. These chips often contain matrix engines and tensor cores, which are designed to execute dense linear algebra operations like GEMM (General Matrix Multiply) and BMM (Batch Matrix Multiply) with high throughput.
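For readers unfamiliar with the two primitives named above, here is a minimal NumPy sketch of GEMM and BMM; the shapes and scaling factors are illustrative only, not taken from the report.

```python
import numpy as np

# GEMM: D = alpha * (A @ B) + beta * C, the core dense linear algebra primitive
alpha, beta = 1.0, 0.5
A = np.random.rand(64, 32)
B = np.random.rand(32, 16)
C = np.random.rand(64, 16)
D = alpha * (A @ B) + beta * C             # shape (64, 16)

# BMM: the same multiply applied independently across a batch dimension,
# e.g. per-head attention scores in a transformer layer
batch = np.random.rand(8, 64, 32)          # 8 independent matrices
weights = np.random.rand(8, 32, 16)
out = np.matmul(batch, weights)            # shape (8, 64, 16)
assert out.shape == (8, 64, 16)
```

Matrix engines accelerate exactly these patterns by issuing many multiply-accumulates per cycle instead of computing each scalar product sequentially.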
 
AI chip-focused startups often take a different approach, deploying cutting-edge architectures and fabrication techniques such as dataflow-controlled processors, wafer-scale packaging, spatial AI accelerators, processing-in-memory (PIM) technologies, and coarse-grained reconfigurable arrays (CGRAs). Various companies have successfully launched these systems (Cerebras, Groq, Graphcore, SambaNova, Untether AI, and others) for data centers and cloud computing, often developing rack-scale solutions for easy enterprise deployment or offering usage on their own cloud platforms. These systems perform exceptionally well, especially in scale-up environments. IDTechEx's report offers comprehensive benchmarking, comparisons, key trends, technology breakdowns, and player activity.
 
Designing AI Chips and the Supply Chain
Developing an AI chip with competitive throughput for training (time-to-train) and inference (tokens per second), high energy efficiency (TOPS/watt), and associated software support is a formidable challenge for all chip designers. This process involves a fine balance of many steps, including selecting programming and execution models, designing optimized hardware and memory architectures, and fabricating with advanced process nodes and advanced semiconductor packaging. For instance, data center chips are adopting the most advanced process nodes from TSMC, Intel Foundry, and Samsung Foundry, using EUV (extreme ultraviolet) lithography from ASML. These foundries are pushing transistor technology past 5nm FinFET (fin field-effect transistor) nodes toward sub-2nm nodes using GAAFETs (gate-all-around FETs) with backside power delivery. Recent fabrication developments, device requirements, hardware architecture breakdowns, advanced semiconductor packaging details, supply chain, and programming model comparisons are all included throughout this report.
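The efficiency metric quoted above is a simple ratio, and it is worth seeing how it is computed. The figures below are hypothetical, chosen only to illustrate the arithmetic; they do not describe any specific chip in the report.

```python
def tops_per_watt(peak_tops, board_power_w):
    """Energy efficiency metric commonly used to compare AI accelerators:
    peak tera-operations per second divided by board power in watts."""
    return peak_tops / board_power_w

# Hypothetical accelerator: 2000 INT8 TOPS peak at 700 W board power.
print(round(tops_per_watt(2000, 700), 2))  # → 2.86
```

Note that peak TOPS figures depend on numeric precision (INT8 vs FP16, for example), so efficiency comparisons are only meaningful at a stated precision and under comparable utilization.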
 
The various technologies involved in design and manufacturing leave wide scope for future technological innovation across the semiconductor industry supply chain. Government policy and heavy investment signal strong interest in pushing frontier AI to new heights, which will require exceptional volumes of AI chips within AI data centers. IDTechEx forecasts this market will grow at a CAGR of 14% from 2025 to 2030, with revenues exceeding US$400 billion.
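To make the headline forecast concrete, the CAGR arithmetic can be checked directly. Assuming the US$400 billion figure refers to 2030 revenue (my reading of the sentence above, not an explicit statement in the source), a 14% CAGR implies a 2025 base of roughly US$208 billion:

```python
def value_after_cagr(start, cagr, years):
    """Compound a starting figure at a constant annual growth rate."""
    return start * (1 + cagr) ** years

# Implied 2025 base, working backwards from ~US$400bn in 2030 at 14% CAGR
base_2025 = 400 / (1.14 ** 5)
print(round(base_2025, 1))  # → 207.7

# Sanity check: compounding the base forward recovers the 2030 figure
assert abs(value_after_cagr(base_2025, 0.14, 5) - 400) < 1e-9
```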
Key Aspects
Hardware evaluation, benchmarking, and comparison for AI chips
Exploring current technologies used in data center GPUs, including analysis and benchmarking of leading processors, AI chip form factors, pricing comparisons, and technology breakdowns for leading US and Chinese players.
Detailing and benchmarking hardware architectures for current market players and emerging AI chips for server CPUs, custom AI ASICs, and other AI chips, including AI accelerators using heterogeneous matrix-based systems and spatial AI accelerators.
Analysis, measurement frameworks, and historical trends of hardware components and current AI chips, including memory, memory bandwidth, throughput (and other performance metrics), scalability, pricing, efficiency, and advanced process nodes.
Breakdown of key elements to designing and manufacturing AI chips, including programming models, hardware architectures, advanced transistors, and advanced semiconductor packaging.
 
Market information
Commentary detailing drivers and barriers of key technology types, as well as expected outlooks.
Analysis of the AI chip supply chain, including capabilities of various semiconductor manufacturers and chip designers for advanced integrated circuits.
Investment information, including governmental investments, investments into advanced semiconductor packaging plants, hyperscaler capex, and chip designer revenues.
Breakdown of US policy since 2022 concerning the export of US chips to foreign nations, including analysis of which chips require export licenses.
 
Market forecast and analysis
10-year granular market forecasts separated by key AI chip technology types.
Assessment of key technological and commercial trends for AI chips used for AI data centers and cloud infrastructure.
