Exclusive: Meta begins testing its first in-house AI training chip
Facebook owner Meta is testing its first in-house chip for training artificial intelligence systems, a key milestone as it moves to design more of its own custom silicon and reduce reliance on external suppliers like Nvidia, two sources told Reuters.
The world's biggest social media company has begun a small deployment of the chip and plans to ramp up production for wide-scale use if the test goes well, the sources said.
Reducing Infrastructure Costs
The push to develop in-house chips is part of a long-term plan at Meta to bring down its mammoth infrastructure costs as the company places expensive bets on AI tools to drive growth. Meta, which also owns Instagram and WhatsApp, has forecast total 2025 expenses of $114 billion to $119 billion, including up to $65 billion in capital expenditure largely driven by spending on AI infrastructure.
Dedicated AI Accelerator
One of the sources said Meta's new training chip is a dedicated accelerator, meaning it is designed to handle only AI-specific tasks. This can make it more power-efficient than the integrated graphics processing units (GPUs) generally used for AI workloads.
Meta executives have said they want to start using their own chips by 2026 for training, or the compute-intensive process of feeding the AI system reams of data to "teach" it how to perform.
Meta is working with Taiwan-based chip manufacturer TSMC to produce the chip, this person said.
Development Process
The test deployment began after Meta finished its first "tape-out" of the chip, a significant marker of success in silicon development work that involves sending an initial design through a chip factory, the other source said.
Meta previously pulled the plug on an in-house custom inference chip after it flopped in a small-scale test deployment similar to the one now underway for the training chip, reversing course and placing orders for billions of dollars' worth of Nvidia GPUs in 2022.
Meta Training and Inference Accelerator (MTIA) Series
The chip is the latest in the company's Meta Training and Inference Accelerator (MTIA) series. The program has had a wobbly start over the years, and Meta at one point scrapped a chip at a similar phase of development.
Future Plans
As with the inference chip, the goal for the training chip is to start with recommendation systems and later use it for generative AI products like chatbot Meta AI, the executives said.
"We're working on how would we do training for recommender systems and then eventually how do we think about training and inference for gen AI," Meta's Chief Product Officer Chris Cox said at the Morgan Stanley technology, media and telecom conference last week.
Cox described Meta's chip development efforts as "kind of a walk, crawl, run situation" so far, but said executives considered the first-generation inference chip for recommendations to be a "big success."