Amazon developing AI chips to reduce reliance on Nvidia

Amazon is investing heavily in developing its own AI chips through Annapurna Labs, an Austin-based startup it acquired in 2015 for $350 million. The company aims to boost efficiency in its data centers and reduce costs for both itself and its AWS customers, the Financial Times reports. According to Dave Brown, vice-president of compute and networking services at AWS, Amazon’s “Inferentia” line of AI chips is already 40% cheaper to run when generating responses from AI models than comparable hardware from Nvidia, the current market leader.

Amazon’s capital spending is expected to reach $75 billion in 2024, with the majority going to technology infrastructure. The company is building its AI infrastructure from the ground up, from the silicon wafers to the server racks to the proprietary software and architecture. While AWS and Annapurna have yet to make a significant dent in Nvidia’s dominance, industry experts argue that giving customers more choice is crucial, given that Nvidia currently holds about a 90% share of the AI processor market.
