
Nvidia A100 Has Emerged as the ‘Workhorse’ for AI Professionals


The Nvidia A100, a roughly $10,000 chip built to perform many simple calculations simultaneously, has become the “workhorse” for artificial intelligence (AI) professionals, says Nathan Benaich, an AI investor and co-author of the State of AI Report. Usage statistics from AI research show Nvidia’s chips ahead of competitors’ by 20-100x.

The research highlighted that Nvidia holds 95 percent of the market for graphics processors that can be used for machine learning, and the A100 is well suited to the kind of machine learning models that power tools like ChatGPT, Bing AI, and Stable Diffusion.

Stability AI, the company that helped develop the image generator Stable Diffusion, now has access to over 5,400 A100 GPUs; CEO Emad Mostaque pointed out that it had just 32 a year ago. Experts say that startups and big companies alike are working on software like chatbots and image generators that requires thousands of Nvidia’s chips. The report noted that the technology behind the A100 was initially used to render sophisticated 3D graphics in games: “The chips need to be powerful enough to crunch terabytes of data quickly to recognize patterns.”

Hyped About AI

On a call with analysts on Wednesday, Nvidia CEO Jensen Huang was equally hyped about AI. “The activity around the AI infrastructure that we built, and the activity around inferencing using Hopper and Ampere to inference large language models has just gone through the roof in the last 60 days. There’s no question that whatever our views are of this year as we enter the year has been fairly dramatically changed as a result of the last 60, 90 days.” Rising demand has also sent prices soaring: Nvidia’s DGX A100, a complete system built around eight A100 chips, costs nearly $200,000. On the call, Nvidia also said it would sell cloud access to DGX systems directly, which would lower the cost of entry.

According to New Street Research, the ChatGPT model inside Bing’s search could require eight GPUs to deliver a response to a question in less than one second. At that rate, Microsoft would need over 20,000 8-GPU servers just to deploy the model in Bing, putting the infrastructure cost at around $4 billion.
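
A quick sanity check of that estimate in Python, using only figures quoted in this article (the roughly $200,000 DGX A100 price and New Street Research’s 20,000-server count; both are the analysts’ assumptions, not measurements):

```python
# Back-of-envelope check of New Street Research's Bing estimate.
# Both inputs are figures quoted in the article, not measured values.
DGX_A100_PRICE_USD = 200_000   # approximate price of one 8-GPU DGX A100
SERVERS_FOR_BING = 20_000      # estimated 8-GPU servers needed for Bing

capex = SERVERS_FOR_BING * DGX_A100_PRICE_USD
gpus = SERVERS_FOR_BING * 8

print(f"Estimated hardware cost: ${capex / 1e9:.1f}B")  # -> $4.0B
print(f"Total A100 GPUs: {gpus:,}")                     # -> 160,000
```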

Antoine Chkaiban, a technology analyst at New Street Research, explained that deploying such a model at the scale of Google, which serves 8 or 9 billion queries every day, would mean spending $80 billion on DGX systems. “The numbers we came up with are huge. But they’re simply the reflection of the fact that every single user taking to such a large language model requires a massive supercomputer while they’re using it.”
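
Extending the same arithmetic shows roughly how much hardware that budget implies (a rough sketch: the $80 billion figure is the analyst’s estimate, the per-unit price is the one quoted above, and volume pricing would differ in practice):

```python
# Roughly how much hardware $80B buys at the quoted DGX A100 price.
# Purely illustrative; both figures come from the analysts' estimates.
DGX_A100_PRICE_USD = 200_000
GOOGLE_SCALE_BUDGET_USD = 80_000_000_000

dgx_systems = GOOGLE_SCALE_BUDGET_USD // DGX_A100_PRICE_USD
print(f"DGX A100 systems: {dgx_systems:,}")   # -> 400,000
print(f"A100 GPUs: {dgx_systems * 8:,}")      # -> 3,200,000
```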

But Huang, in an interview with CNBC, said Nvidia’s products are actually inexpensive for the amount of computation these kinds of models need. “We took what otherwise would be a $1 billion data center running CPUs, and we shrunk it down into a data center of $100 million. Now, $100 million, when you put that in the cloud and shared by 100 companies, is almost nothing.”
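
Huang’s framing, restated as arithmetic (all figures are his, from the quote above; none are independent measurements):

```python
# Huang's cost-compression claim, restated as arithmetic.
cpu_datacenter_usd = 1_000_000_000  # "$1 billion data center running CPUs"
gpu_datacenter_usd = 100_000_000    # "shrunk it down into ... $100 million"
companies_sharing = 100             # "shared by 100 companies"

print(f"Cost reduction: {cpu_datacenter_usd // gpu_datacenter_usd}x")              # -> 10x
print(f"Per company in the cloud: ${gpu_datacenter_usd // companies_sharing:,}")   # -> $1,000,000
```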
