Google Unveils Trillium: AI Data Centre Chip 4.7x Faster, Launching Late 2024

Google's new Trillium chip delivers 4.7 times the computing performance of the TPU v5e, making it nearly five times faster than its predecessor, and is 67% more energy efficient.

Trillium
Credit: The Register

Google's parent company, Alphabet, has announced the launch of Trillium, a new addition to its artificial intelligence (AI) data centre chip family. According to the company, Trillium is nearly five times faster than its previous version, marking a significant advancement in AI computing power.


Demand for machine-learning computing has skyrocketed in recent years, growing by a factor of one million in the last six years alone. Sundar Pichai, CEO of Alphabet, highlighted Google's pioneering efforts in AI chips, stating, "I think Google was built for this moment, we've been pioneering (AI chips) for more than a decade."


Google's custom AI data-centre chips, known as tensor processing units (TPUs), combined with closely integrated software, provide a viable alternative to Nvidia's dominant processors. While Nvidia currently holds around 80% of the AI data-centre chip market, Google's TPUs make up the majority of the remaining 20%.


The newly unveiled Trillium chip, the sixth generation of Google's TPUs, promises computing performance 4.7 times better than that of the TPU v5e. The chip is specifically designed to power technologies that generate text and other media from large models. Additionally, the Trillium processor boasts a 67% increase in energy efficiency over the v5e.


Google plans to make the Trillium chip available to its cloud customers in late 2024. By increasing the high-bandwidth memory capacity and overall bandwidth, Google's engineers have achieved additional performance gains. This is crucial as AI models require vast amounts of advanced memory, which has been a bottleneck in further enhancing performance.


Trillium chips are deployed in pods of 256 chips, which can be scaled up to hundreds of pods. This scalability ensures that Google's AI data centres can handle the increasing demand for machine learning and AI applications.
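As a back-of-the-envelope illustration of that scaling (the 256-chip pod size is from the article; the pod count used below is a hypothetical example, since the article says only "hundreds of pods"):

```python
# Illustrative arithmetic only. CHIPS_PER_POD comes from the article;
# the pod count passed in is a made-up example value.
CHIPS_PER_POD = 256  # Trillium chips per pod

def total_chips(num_pods: int) -> int:
    """Total accelerator chips across a deployment of num_pods pods."""
    return num_pods * CHIPS_PER_POD

# A hypothetical 300-pod deployment:
print(total_chips(300))  # 76800 chips
```

At that scale, a deployment of a few hundred pods already reaches tens of thousands of accelerators, which is why pod-level scalability matters for large-model workloads.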


Source: REUTERS
