Amazon has revealed that it is developing two chips dedicated to artificial intelligence workloads: Inferentia and Trainium.
The two chips, developed by Amazon's cloud computing arm, Amazon Web Services (AWS), are intended to let customers build large language models (LLMs) on AWS servers using Amazon's own processors, without needing Nvidia chips. With them, Amazon aims to compete with Nvidia, the leading manufacturer of AI chips, whose flagship products include the Grace Hopper Superchip.
"How quickly can these companies move to develop these generative AI applications is driven by starting first on the data they have in AWS and using compute and machine learning tools that we provide," said Mai-Lan Tomsen Bukovec, Vice President of Technology at AWS.
She added that many customers are already accustomed to Amazon's servers and tools, noting that AWS currently has more than 100,000 customers using its machine learning services.
Amazon has a decade of experience developing custom silicon, beginning with the Nitro chip; the company says at least one Nitro chip is present in each of its servers.