OpenAI Moves Into Chip Design With Broadcom as Mass Production Targeted for 2026

OpenAI is preparing to launch its first custom artificial intelligence chip, marking a major shift in the company’s strategy as it seeks to reduce dependence on Nvidia and strengthen control over its own computing infrastructure. The chip, internally referred to as an “XPU,” is being co-developed with Broadcom and is expected to enter mass production in 2026.

The initiative reflects a broader industry trend in which leading technology companies—including Google, Amazon, Microsoft, and Meta—have invested in designing proprietary chips to handle the massive computing requirements of advanced AI systems. By following the same path, OpenAI aims to secure greater efficiency, lower costs, and long-term supply stability at a time when demand for AI computing power is skyrocketing.

Sources familiar with the project say the new chip will be used exclusively to power OpenAI’s internal operations and will not be sold commercially. This vertical integration would give the company more direct control over the performance of its infrastructure, helping it scale future iterations of ChatGPT and other AI models without being bound by supply constraints from external suppliers.

The collaboration is also a strategic win for Broadcom. Chief executive Hock Tan recently pointed to a surge in orders from a new client that will significantly boost Broadcom’s AI-related revenue in fiscal 2026, and industry analysts have identified OpenAI as that customer. The reported order exceeds $10 billion, underscoring the scale of the investment.

Manufacturing of the chip is expected to be handled by Taiwan Semiconductor Manufacturing Company (TSMC), the world’s largest contract chipmaker, which also produces AI accelerators for Nvidia, AMD, and the major hyperscalers. With design work already underway, TSMC is preparing to begin early production later this year before ramping to full capacity in 2026.

For OpenAI, the move signals a pivotal step in evolving from a software-centric research lab into a vertically integrated AI platform. By controlling more of its hardware stack, the company is positioning itself to compete with the biggest players in the cloud and AI infrastructure market. It also highlights the intensifying race to secure the computational power required for the next generation of artificial intelligence.
