What’s the H100, the chip driving generative AI? (2024)

It’s rare that a computer component sets pulses racing beyond the tech industry. But when Nvidia Corp issued a blowout sales forecast in May that sent its market value above US$1 trillion, the star of the show was its latest graphics processing unit, the H100. The new data centre chip is showing investors that the buzz around generative artificial intelligence (AI) – systems that can perform a wide range of tasks at superpowered speed – is translating into real revenue, at least for Nvidia. Demand for the H100 is so great that some customers are having to wait as long as six months to receive it.

What is the H100?

The H100, whose name is a nod to computer science pioneer Grace Hopper, is a graphics processor. It’s a type of chip that normally lives in PCs and helps gamers get the most realistic visual experience. Unlike its regular counterparts, though, the chip’s 80 billion transistors are arranged in cores that are tuned to process data at high speed, not generate images. Nvidia, founded in 1993, pioneered this market with investments in technology going back almost two decades, when it bet that the ability to do work in parallel would one day make its chips valuable in applications outside of gaming.

Why is the H100 so special?

Generative AI platforms learn to complete tasks such as translating text, summarising reports and writing computer code after being trained on vast quantities of pre-existing material. The more they see, the better they become at things like recognising human speech or writing job cover letters. They develop through trial and error, making billions of attempts to achieve proficiency and sucking up huge amounts of computing power in the process. Nvidia says the H100 is four times faster than the chip’s predecessor, the A100, at training these so-called large language models, or LLMs, and is 30 times faster when replying to user prompts. For companies racing to train their LLMs to perform new tasks, that performance edge can be critical.
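Those claimed speedups translate directly into wall-clock time for a training run. A back-of-envelope sketch in Python (the 30-day job below is a hypothetical figure for illustration, not a benchmark from the article):

```python
def h100_days(a100_days, training_speedup=4.0):
    """Estimated duration of an A100 training job on H100s, assuming
    Nvidia's claimed 4x training speedup holds for the whole run."""
    return a100_days / training_speedup

# A hypothetical LLM training run that takes 30 days on A100s
# would take about a week on H100s:
print(h100_days(30))  # 7.5
```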

How did Nvidia get pole position?

It’s the world leader in so-called graphics processing units (GPUs) – the bits of a computer that generate the images you see on the screen. The most powerful GPUs, which can produce realistic-looking scenery in fast-moving video games, have multiple processing cores that perform several simultaneous computations. Nvidia’s engineers realised in the early 2000s that GPUs could be retooled to become so-called accelerators for other applications, by dividing tasks up into smaller lumps and then working on them at the same time. Just over a decade ago, AI researchers discovered that their work could finally be made practical by using this type of chip.
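The retooling idea – split one big job into many small, identical pieces and run them at the same time – can be sketched with Python’s standard library. This is a loose analogy for illustration only, not Nvidia’s actual programming model:

```python
from concurrent.futures import ThreadPoolExecutor

def scale_chunk(chunk, factor=2.0):
    # One worker's share of the job: apply the same simple operation
    # to its slice of the data, independently of every other worker.
    return [x * factor for x in chunk]

def parallel_scale(data, n_workers=4):
    # Divide the task up into smaller lumps...
    size = max(1, (len(data) + n_workers - 1) // n_workers)
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    # ...and then work on them at the same time.
    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        results = pool.map(scale_chunk, chunks)
    # Reassemble the pieces in their original order.
    return [x for chunk in results for x in chunk]

print(parallel_scale([1.0, 2.0, 3.0, 4.0]))  # [2.0, 4.0, 6.0, 8.0]
```

A GPU does the same thing in hardware, with thousands of cores instead of a handful of threads.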

What’s the state of the competition?

Nvidia controls about 80 per cent of the market for the accelerators in the AI data centres operated by Amazon.com’s AWS, Alphabet’s Google Cloud and Microsoft’s Azure. Those companies’ in-house efforts to build these chips, and rival products from chipmakers such as Advanced Micro Devices (AMD) and Intel, haven’t made much of an impression on the accelerator market so far.

Why is that?

Nvidia has rapidly updated its offerings, including software to support the hardware, at a pace no competitor has so far been able to match. Chips such as Intel’s Xeon processors have fewer processing cores. While they’re capable of more complex data crunching, they’re much slower at working through the mountains of information typically used to train AI software. Nvidia’s data centre division posted a 41 per cent increase in revenue to US$15 billion in 2022.


Are others catching up?

AMD, the second-largest maker of computer graphics chips, unveiled a version of its Instinct line in June aimed at the market that Nvidia’s products dominate. The chip, called MI300X, has more memory to handle workloads for generative AI, AMD chief executive officer Lisa Su told the audience at an event in San Francisco. “We are still very, very early in the life cycle of AI,” she said. Intel is bringing specific chips for AI workloads to the market but acknowledged that, for now, demand for data centre graphics chips is growing faster than for the central processor units that were traditionally the company’s strength. Nvidia’s advantage isn’t just in the performance of its hardware. The company invented something called Cuda, a language for its graphics chips that allows them to be programmed for the type of work that underpins AI programs. BLOOMBERG


FAQs


What is the H100 in AI?

The H100 is among the most powerful GPU chips on the market and is designed specifically for artificial intelligence (AI) applications, such as training and running large language models.

Why is the Nvidia H100 so popular?

The H100 processor has enabled a new generation of artificial intelligence tools that promise to transform entire industries, propelling its developer Nvidia Corp. to become one of the world's most valuable companies.

How much is the Nvidia H100?

The H100 is Nvidia's current top-of-the-line data centre GPU and costs roughly US$25,000 per unit, according to a slide in an earlier company presentation showing that a 16-GPU H100 system costs US$400,000.
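The two figures on that slide are consistent with each other, as a quick check shows (the division below is my own arithmetic, not a figure from Nvidia):

```python
system_price = 400_000   # quoted price of a 16-GPU H100 system
gpus_per_system = 16
per_gpu = system_price / gpus_per_system
print(per_gpu)  # 25000.0 -- matching the roughly $25,000-per-GPU figure
```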

What is the most famous generative AI?

The best generative AI tools at a glance (as of Jun 7, 2024):

Wondershare Filmora (AI video tools): AI video editing
Midjourney (AI image tools): High-quality results
Adobe Photoshop (AI image tools): AI-powered editing
DALL·E 3 (AI image tools): Ease of use

(16 more rows in the original table.)

Why is the H100 so expensive?

A single H100, small enough to hold in your hand, costs about the same as an entire Tesla Model 3 (though with less chance of a fatal-to-operator malfunction). The cost to produce it isn't what matters: the chips are so expensive because of demand and the limited capacity of fabs to produce them.

How many H100s does Tesla have?

Many of the same investors have also backed Tesla, which has deployed some 35,000 Nvidia H100s to train its self-driving systems, alongside developing supercomputers that use its custom Dojo chips.

What is the Nvidia H100 used for?

H100 with MIG (Multi-Instance GPU) lets infrastructure managers standardise their GPU-accelerated infrastructure while retaining the flexibility to provision GPU resources at finer granularity, securely giving developers the right amount of accelerated compute and optimising usage of all their GPU resources.

Who manufactures the H100 chip?

The H100 is a graphics processing unit (GPU) manufactured by Nvidia. It is among the most powerful GPU chips on the market and is designed specifically for artificial intelligence (AI) applications. The H100 contains 80 billion transistors, up from roughly 54 billion in its predecessor, the A100.

Who is buying H100s?

Microsoft and Meta (META) are two of the largest buyers of Nvidia's H100 chips. The two companies spent a combined US$9 billion on the accelerators in 2023 alone, according to analysts at DA Davidson, while Omdia Research estimates they acquired 150,000 chips each.
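Those spending and unit estimates imply a per-chip price in line with the retail figures quoted elsewhere on this page; the calculation below is my own arithmetic, not the analysts':

```python
combined_spend = 9_000_000_000   # Microsoft + Meta, 2023 (DA Davidson)
chips_per_company = 150_000      # per-company estimate (Omdia Research)
implied_price = combined_spend / (2 * chips_per_company)
print(implied_price)  # 30000.0 -- about $30,000 per accelerator
```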

Who is the Nvidia H100's competitor?

AMD's MI300X. In low-level benchmarks testing cache, latency, inference and more, the MI300X shows strong results against the H100 for a single GPU, and testing by Chips and Cheese suggests it has a very strong architecture.

How much is the H100 in the US?

One US retailer lists the NVIDIA H100 80GB GPU at US$30,970.79.

How many H100s has Nvidia sold?

Omdia says that Nvidia sold nearly half a million A100 and H100 GPUs, and demand for these products is so high that the lead time of H100-based servers is from 36 to 52 weeks.

Who is the leader in AI right now?

NVIDIA Corp (NVDA)

Today, NVIDIA continues to be at the forefront of AI and is developing software, chips and AI-related services.

Is Alexa a generative AI?

Amazon is building generative AI into Alexa; the company says it will drive the next step-function change in ambient intelligence, enabling more personalised and more natural Alexa experiences.

What company is leading in generative AI?

China is dominating the patent race for generative AI, with Tencent and Baidu topping the list.

What is the H100 used for?

The NVIDIA H100 Tensor Core GPU powered by the NVIDIA Hopper GPU architecture delivers the next massive leap in accelerated computing performance for NVIDIA's data center platforms. H100 securely accelerates diverse workloads from small enterprise workloads, to exascale HPC, to trillion parameter AI models.

What is the difference between the A100 and the H100?

If you need top-tier double-precision performance and superior memory bandwidth, or you're dealing with next-gen HPC at datacenter scale and trillion-parameter AI, the H100 is the clear winner. For a more versatile and cost-effective solution that still delivers powerful AI performance, the A100 is a solid choice.

How many H100s does OpenAI have?

One estimate suggests that a large-scale deployment of OpenAI's Sora video tool would use some 720,000 Nvidia H100 GPUs, worth about US$21.6 billion.

How many H100s does Meta have?

The future of Meta's AI infrastructure

By the end of 2024, we're aiming to continue to grow our infrastructure build-out that will include 350,000 NVIDIA H100s as part of a portfolio that will feature compute power equivalent to nearly 600,000 H100s.
