Amazon Web Services partners with Hugging Face to streamline AI model usage

In a notable development for artificial intelligence (AI), Amazon.com’s cloud unit has announced a partnership with Hugging Face, an AI startup valued at $4.5 billion. The collaboration aims to streamline running thousands of AI models on Amazon’s custom computing chips, specifically Inferentia2, an inference chip designed by Amazon Web Services (AWS).

Hugging Face has built a strong reputation as a central hub for AI researchers and developers to access and work with open-source AI models. Backed by tech giants like Amazon, Google, and Nvidia, Hugging Face is the go-to platform for obtaining and experimenting with models such as Meta Platforms' popular Llama 3. However, once developers have fine-tuned an open-source model, they often seek a way to use it to power their software.

The partnership between Amazon and Hugging Face aims to address this need for efficiency and cost-effectiveness. Jeff Boudier, head of product and growth at Hugging Face, highlighted the importance of enabling as many people as possible to run AI models, and of ensuring they can do so efficiently. By using the Inferentia2 chip, developers can take advantage of the performance and cost savings AWS offers.

Matt Wood, who leads AWS’s artificial intelligence products, emphasized the advantage of using AWS chips for inference tasks. While Nvidia currently dominates the market for model training, AWS argues that its chips excel at executing trained models, a step known as inference, at a lower cost over time. Wood noted that while a model might be trained once a month, it could be running inference thousands of times an hour. This is where the Inferentia2 chip shines, giving AI developers a compelling reason to consider AWS for deployment.

The partnership between Amazon and Hugging Face demonstrates a concerted effort to optimize AI model usage on AWS chips, appealing to AI developers seeking efficiency, cost-effective solutions, and the ability to easily scale their AI applications. As the AI industry continues to grow rapidly, collaborations like this are instrumental in driving innovation and creating a robust ecosystem for AI development and deployment.

The future of AI is being shaped by partnerships and advancements like these. With the support of industry-leading companies like Amazon, Hugging Face is positioned to further expand its reach and empower developers to unlock the potential of AI in their applications. As Jeff Boudier summarized: “One thing that’s very important to us is efficiency - making sure that as many people as possible can run models and that they can run them in the most cost-effective way.” With this partnership, Amazon and Hugging Face are paving the way for a more accessible and efficient AI landscape.


Written By

Jiri Bílek

In the vast realm of AI and U.N. directives, Jiri crafts tales that bridge tech divides. With every word, he champions a world where machines serve all, harmoniously.