
Meta’s Next-Gen AI Chips To Be Faster Than Ever Before

Team Gossip  |   Apr 12, 4:49 AM   |   6 min read


Highlights

  • Meta has promised that the latest iteration of its custom AI chips will be much more advanced.

  • The new chips aim to make training more efficient and to make inference, the actual reasoning task, easier.

  • Meta launched MTIA v1 in May last year, aiming to deploy these chips in its data centers.

Meta has promised that the latest iteration of its custom AI chips will be much more advanced and capable of training its ranking models faster than ever before. The Meta Training and Inference Accelerator (MTIA) was created to work seamlessly with the company’s ranking and recommendation models.

 

The new chips aim to make training more efficient and to make inference, the actual reasoning task, easier. In a blog post, Meta noted that MTIA is an important part of its long-term plan to build infrastructure around how it uses artificial intelligence across its platforms. Meta wants its chip designs to align with its current technology infrastructure as well as with future innovations in GPUs.

 

“Meeting our ambitions for our custom silicon means investing not only in compute silicon but also in memory bandwidth, networking, and capacity as well as other next-generation hardware systems,” Meta’s blog post reads.

 


 

How Do Meta’s MTIA Chips Function?

 


 

Meta launched MTIA v1 in May last year, aiming to deploy these chips in its data centers. The next-gen MTIA chip will also be targeted at data centers. MTIA v1 originally wasn’t expected to arrive before 2025, but Meta says both chips are already in production.

 

As of now, MTIA focuses on training ranking and recommendation algorithms, but the company says its ultimate goal is to extend the chip’s abilities to training generative AI, such as its Llama language models.

 

Meta claims the new MTIA chip “is fundamentally focused on providing the right balance of compute, memory bandwidth, and memory capacity.” The new chip features 256MB of on-chip memory running at 1.3GHz, compared with v1’s 128MB at 800MHz.
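 

For a sense of scale, here is a quick back-of-the-envelope comparison of the figures quoted above (a minimal sketch in Python using only the numbers from the announcement; raw spec ratios don’t capture architectural changes, which is why Meta’s measured gain is larger):

```python
# Rough comparison of the headline MTIA specs quoted above.
# Figures from the announcement: v1 has 128MB of on-chip memory at 800MHz,
# while the next-gen chip has 256MB at 1.3GHz.

v1_memory_mb, v1_clock_ghz = 128, 0.8   # MTIA v1
v2_memory_mb, v2_clock_ghz = 256, 1.3   # next-gen MTIA

print(f"On-chip memory: {v2_memory_mb / v1_memory_mb:.1f}x larger")   # 2.0x larger
print(f"Clock speed:    {v2_clock_ghz / v1_clock_ghz:.2f}x faster")   # ~1.62x faster

# Meta's own early tests report roughly 3x overall performance across the
# four models it evaluated, i.e. more than the raw spec ratios alone suggest.
```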

 

Early findings from the company suggest the new chip delivers three times the performance of its predecessor across the four models Meta evaluated. Meta has had the MTIA v2 in the works for a while now. The project was internally dubbed Artemis and was earlier tipped to focus only on inference.

 


 

Other AI Companies Jumping On The Bandwagon

 

There has been a significant rise in AI companies planning to build their own chips as demand for computing power grows alongside the use of artificial intelligence. Google launched its TPU chips back in 2017, while Microsoft introduced its Maia 100 chip. Amazon also brought out its Trainium 2 chip, built to train foundation models four times faster than the previous iteration.

 

The race to acquire powerful chips underscores the need for custom silicon to train AI models. Demand for chips has grown to the point where Nvidia, which dominates the AI chip market, is currently valued at $2 trillion.
