Amazon has begun development of its third-generation AI chip, Trainium2, at a facility north of Austin, Texas, according to Bloomberg News, which has interviewed several senior executives at the company. Trainium2 is part of Amazon's strategy to take control of its chip production and reduce its dependence on external suppliers such as Nvidia, currently a major supplier of AI chips to Amazon's data centers.
Amazon has long used Nvidia chips for its AI-based services, but the company now aims to replace them with its own custom-built chips optimized for its specific needs. This is part of Amazon's larger push to develop its own hardware so it can power its services more cost-effectively and flexibly.
– Nvidia is a very, very competent company that does excellent work, so they will have a good solution for many customers for a long time to come, says James Hamilton, a senior vice president at Amazon and a central figure in the decision to invest in in-house chips. Hamilton was the one who convinced Amazon founder and then-CEO Jeff Bezos to invest in developing the company's own chip technology to better meet future needs.
– We are confident that we can manufacture a component that can compete with them from top to toe, Hamilton adds. It is a strong indication that Amazon believes it can take on established vendors such as Nvidia by designing and manufacturing its own chips.
Trainium2: A crucial milestone in Amazon's strategy
Trainium2 is seen as a "make-or-break" moment for Amazon, according to chip industry experts. Under the so-called three-generation rule, a well-established principle in chip development, Trainium2 must deliver performance good enough to justify the investment the company has made in its development. If the chip cannot meet industry standards and expectations, Amazon risks having to reevaluate its investment and perhaps look for other alternatives.
– I have literally never seen a product deviate from the three-generation rule, says Naveen Rao, a chip industry veteran and current AI expert at Databricks. Rao is a well-known figure who oversees the development of AI solutions at Databricks, which recently entered into an agreement with Amazon to use Trainium2 to power its AI tools.
Trainium2 thus marks a critical point in Amazon's long-term plan to become more self-sufficient in the advanced chips required to run AI solutions at scale.
Collaboration with Databricks and increased performance
In October 2023, Databricks announced an agreement with Amazon in which the company agreed to train its AI models on Trainium2, as part of its long-term collaboration with Amazon's cloud service, AWS (Amazon Web Services). Currently, Databricks' AI tools rely mainly on Nvidia chips, but under the new deal parts of that system will gradually be replaced and powered by Trainium2 instead.
According to Amazon, Trainium2 has impressive specifications. The chip is said to deliver four times the performance and three times the memory of the previous generation. This makes Trainium2 a more powerful solution for handling the enormous volume of calculations and data required to train and run advanced AI models. Amazon expects the chip to strengthen its competitiveness, not only against Nvidia but also against other major players in the chip industry such as Intel and AMD.
Amazon has made it clear that the goal is not just to create a competitive product, but also to ensure that its internal AI infrastructure becomes more efficient and cost-effective in the long term. If Trainium2 is successful, it could set the standard for how companies design and use specialized chips to power their AI and cloud services in the future.
Amazon and the future of AI chips
Amazon's investment in its own AI chips is closely tied to the company's long-term vision of remaining a leading player in both the cloud and AI markets. Developing its own chips brings not only technical innovation but also economic benefits, as it reduces the company's dependence on external suppliers and thereby lowers costs over time.
For more information about AI chips and their development, you can read more about the latest progress on Nvidia's official website, or delve into Amazon's work on cloud services by visiting the AWS website.