Meta unveils in-house AI chip

Meta has revealed more details about the custom AI chips it is developing in-house.

The company announced its first-generation custom chip for running AI models. The Meta Training and Inference Accelerator (MTIA) is designed for AI inference tasks, in which an already trained model makes predictions or draws conclusions from new, unseen data.
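To make the training-versus-inference distinction concrete, the sketch below uses PyTorch (a framework Meta maintains) with a toy model standing in for a real recommendation network. It is not Meta's code; the model, data, and sizes are invented for illustration, and the only point is that inference is the gradient-free, prediction-only phase that an accelerator like MTIA is built to run.

    import torch
    import torch.nn as nn

    # Toy stand-in for a recommendation-style model: scores an item
    # from a small feature vector. Purely illustrative, not Meta's model.
    model = nn.Sequential(
        nn.Linear(8, 16),
        nn.ReLU(),
        nn.Linear(16, 1),
    )

    # Training phase: done ahead of time, needs gradients and many passes.
    optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
    loss_fn = nn.MSELoss()
    features = torch.randn(64, 8)   # historical examples (synthetic here)
    labels = torch.randn(64, 1)     # known outcomes (synthetic here)
    for _ in range(100):
        optimizer.zero_grad()
        loss = loss_fn(model(features), labels)
        loss.backward()             # gradients are only needed for training
        optimizer.step()

    # Inference phase: the workload MTIA targets.
    model.eval()                    # switch to inference behavior
    new_item = torch.randn(1, 8)    # new, unseen data
    with torch.no_grad():           # no gradients: a cheaper, forward-only pass
        score = model(new_item)
    print(score.item())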

  • The AI inference chip line will help power some of Meta's recommendation algorithms for ads and content in news feeds.
  • According to Meta's VP of Infrastructure, Santosh Janardhan, the MTIA chip offers improved computational power and efficiency compared to CPUs.
  • By combining MTIA chips with GPUs, Meta said it expects improved efficiency and performance across its AI workloads.
  • Meta is also working on a more advanced chip capable of both training and inference.
  • Meta also announced a new chip called MSVP, the Meta Scalable Video Processor, built for video transcoding: converting video into the formats and bitrates needed to deliver it to users.
  • Meta also shared plans to revamp its data centers with modern AI-focused networking and cooling systems; construction on the first such facility is set to begin later this year.
  • The company and its partners have completed the second phase of Meta's AI Research SuperCluster (RSC), which Meta describes as one of the fastest AI supercomputers in the world.
