Broadcom Extends Multi‑Year AI Chip Partnership with Meta Through 2029

AVGO
April 15, 2026

Broadcom Inc. and Meta Platforms announced an extension of their AI‑chip partnership under which Broadcom will design, package, and supply Meta’s next‑generation Training and Inference Accelerator (MTIA) chips through 2029. The deal builds on the companies’ existing collaboration and includes an initial commitment of more than 1 GW of compute capacity, the first phase of a sustained, multi‑gigawatt rollout to be powered by 2‑nanometer process technology and Broadcom’s XPU platform with advanced Ethernet capabilities.

The agreement secures a long‑term, multi‑generation supply contract that will generate recurring revenue for Broadcom’s AI‑chip and networking businesses. Broadcom’s AI revenue reached $8.4 billion in Q1 FY2026, up 106% year‑over‑year, underscoring the segment’s rapid growth. Meta’s plan to spend up to $135 billion on AI infrastructure in 2026 places the MTIA partnership at the core of its custom‑silicon strategy, reducing reliance on third‑party suppliers and enabling higher performance and efficiency for its data‑center workloads.

Broadcom CEO Hock Tan will step down from Meta’s board and take on an advisory role focused on Meta’s custom silicon roadmap. Tan said, "This initial MTIA deployment is just the beginning of a sustained, multi‑generation roadmap to serve the trajectory of massive growth over the next few years that highlights Broadcom's unmatched leadership in AI networking and the power of our foundational XPU custom accelerator platform."

Meta CEO Mark Zuckerberg added, "Meta is partnering with Broadcom across chip design, packaging, and networking to build out the massive computing foundation we need to deliver personal superintelligence to billions of people. As we roll out more than 1GW of our custom silicon to start and then multiple gigawatts over time, this partnership will give us greater performance and efficiency for everything we're building."

The partnership strengthens Broadcom’s position in the AI infrastructure market and expands its customer base with a large, long‑term order. It also positions Meta to accelerate its generative‑AI initiatives, as the MTIA chips are designed for inference and the low‑precision processing critical to large‑language‑model workloads. The multi‑generation contract signals both companies’ confidence in the continued expansion of AI workloads and the need for specialized silicon to meet performance and power requirements.