
Google announces Axion, its first Arm-based CPU for data centers

Google Cloud Next 2024 has begun, and the company is kicking off the event with some big announcements, including its new Axion processor. It’s Google’s first Arm-based CPU built specifically for data centers, designed around Arm’s Neoverse V2 core.

According to Google, Axion performs 30 percent better than the fastest general-purpose Arm-based instances currently available in the cloud and 50 percent better than the most recent, comparable x86-based VMs. Google also claims it’s 60 percent more energy efficient than those same x86-based VMs. The company is already using Axion in services like BigTable and Google Earth Engine, and plans to expand it to more services in the future.

The release of Axion could bring Google into competition with Amazon, which has led the field of Arm-based CPUs for data centers. The company’s cloud business, Amazon Web Services (AWS), released the Graviton processor back in 2018, following it with second and third iterations over the next two years. Fellow chip developer NVIDIA released its first Arm-based CPU for data centers, Grace, in 2021, and companies like Ampere have also been making gains in the area.

Google has been developing its own processors for several years now, but they’ve been primarily focused on consumer products. The original Arm-based Tensor chip first shipped in the Pixel 6 and 6 Pro smartphones, which were released in late 2021. Subsequent Pixel phones have all been powered by updated versions of the Tensor. Prior to that, Google developed the Tensor Processing Unit (TPU) for its data centers. The company started using TPUs internally in 2015, announced them publicly in 2016, and made them available to third parties in 2018.

Arm-based processors are often a lower-cost and more energy-efficient option. Google’s announcement came right after Arm CEO Rene Haas issued a warning about the energy usage of AI models, according to the Wall Street Journal. He called models such as ChatGPT “insatiable” in their need for electricity. “The more information they gather, the smarter they are, but the more information they gather to get smarter, the more power it takes,” Haas said. “By the end of the decade, AI data centers could consume as much as 20 percent to 25 percent of US power requirements. Today that’s probably four percent or less. That’s hardly very sustainable, to be honest with you.” He stressed the need for greater efficiency in order to maintain the pace of breakthroughs.


