
Google Cloud’s Tensor Processing Units (TPUs) have surged ahead of competitors in the AI hardware market, according to a recent Omdia report. The report, titled ‘Checking in with hyperscalers’ AI chips: Spring 2024,’ reveals that Google plans to deploy approximately $6 billion worth of TPUs to its data centers in 2024.

These TPUs are integral to supporting Google’s internal projects like Gemini, Gemma, and Search, as well as handling customer workloads on the Google Cloud Platform.

While major cloud hyperscalers – including Google Cloud, Microsoft Azure, Amazon Web Services, and Meta Platforms – are developing custom AI accelerators, Google Cloud’s TPUs are notably outperforming the competition.

Impact on Google Cloud Platform Profitability

Alexander Harrowell, Principal Analyst for Advanced Computing at Omdia, suggests that the success of Google’s TPUs is likely contributing to the profitability of the Google Cloud Platform. He also highlights the expanding ecosystem of semi-custom chip providers – such as Broadcom, Marvell, Alchip, and Arm plc’s Neoverse CSS service – which is shaping the industry’s trend towards custom silicon.

Emergence of a New AI Chip

The report highlights an intriguing development: the emergence of ‘Customer C,’ a U.S.-based cloud computing company set to debut a new AI chip in 2026. Harrowell notes that the extended lead time for this chip project points to a significant innovation.

About Omdia

Omdia, part of Informa Tech, provides technology research and advisory services. Their insights help organizations make informed decisions to drive growth in the tech market.
