Amazon Expands Nvidia Partnership with New AI Chip Release
Amazon's AWS cloud division unveiled its latest advancements at the re:Invent conference in Las Vegas, introducing the Trainium2 AI chip and the versatile Graviton4 processor.
The company also revealed plans to provide access to Nvidia's cutting-edge H200 AI graphics processing units (GPUs).
With a strategic focus on diverse, cost-effective cloud offerings, Amazon Web Services aims to distinguish itself by doing more than just selling economical Amazon-branded products.
Like its online retail approach, AWS intends to feature top-tier products, particularly sought-after GPUs from leading AI chipmaker Nvidia.

This two-pronged strategy positions AWS more competitively against its primary rival. In a similar move, Microsoft recently introduced its first AI chip, the Maia 100, while also announcing the inclusion of Nvidia H200 GPUs in its Azure cloud.
The Arm-based Graviton4 processors are more energy-efficient than comparable chips from Intel or AMD. Promising a 30% performance improvement over the existing Graviton3, the Graviton4 delivers more output for the price.
In a climate of heightened inflation, organizations seeking to continue using AWS while managing cloud expenses may find the Graviton4 a compelling option.
Already adopted by over 50,000 AWS customers, the Graviton chips have garnered attention from companies like startup Databricks and Amazon-backed Anthropic, an OpenAI competitor. These entities plan to harness the enhanced performance of Trainium2 chips, which promise four times the capability of their predecessors.
AWS has committed to operating a cluster of more than 16,000 Nvidia GH200 Grace Hopper Superchips, reserved for Nvidia's own research and development purposes. Other AWS customers will still have access to alternative computing options.

The demand for Nvidia GPUs surged after the release of OpenAI’s ChatGPT, triggering a chip shortage as numerous companies rushed to integrate similar generative AI technologies into their products.
While the introduction of an AI chip by a cloud provider might typically challenge a chipmaker like Nvidia, Amazon’s simultaneous expansion of collaboration with Nvidia poses a unique scenario. AWS customers, unable to secure the latest Nvidia GPUs, now have an additional AI computing option through Amazon’s advancements.
As a dominant force in cloud computing, Amazon has rented out GPUs within its cloud infrastructure for over a decade. In 2018, it joined cloud rivals Alibaba and Google by launching an in-house developed AI processor, delivering robust computing power at an affordable rate.

Since its inception in 2006 with the EC2 and S3 services for computing and data storage, AWS has introduced over 200 cloud products. While not all have enjoyed widespread success, Amazon remains committed to investing in the Graviton and Trainium programs, indicating that it anticipates continued market demand.
AWS has not disclosed release dates for virtual-machine instances equipped with Nvidia H200 chips or those relying on Trainium2 silicon. However, customers can commence testing Graviton4 virtual-machine instances, which will be commercially available in the coming months.