Nvidia's H100 AI GPU shortages ease as lead times drop from up to four months to 8-12 weeks
The supply constraints on Nvidia's H100 AI GPUs are finally evaporating
According to Digitimes, Dell Taiwan General Manager Terence Liao reports that delivery lead times for Nvidia's H100 AI GPUs have dropped over the past few months from the 3-4 months we previously reported down to just 2-3 months (8-12 weeks). Server ODMs say supply is finally easing compared to 2023, when it was virtually impossible to obtain Nvidia's H100 GPUs.
Despite the shorter lead times, Liao says demand for AI-capable hardware remains extremely high. AI server purchases are supplanting general-purpose server purchases at businesses, even though AI servers are incredibly costly. He believes procurement lead times are the only factor still holding these purchases back.
A 2-3 month delivery window is the shortest lead time we've seen for Nvidia's H100 GPUs. Just six months ago, lead times stretched to 11 months, meaning most of Nvidia's customers had to wait nearly a year for their AI GPU orders to be fulfilled.
Lead times have decreased significantly since the start of 2024. First came a big reduction to 3-4 months earlier in the year; now they have shrunk by another month. At this rate, lead times could evaporate entirely by the end of the year, if not sooner.
This easing is potentially a knock-on effect of companies with a surplus of H100 GPUs reselling part of their supply to offset the high maintenance costs of unused inventory. Additionally, AWS has made it easier to rent Nvidia H100 GPUs through the cloud, which is also helping to relieve some of the H100 demand.
The only remaining Nvidia customers still struggling with supply constraints are large companies like OpenAI that are developing their own LLMs. These companies need tens or even hundreds of thousands of GPUs to train their models quickly and effectively.
The good news is that this shouldn't be a problem for long. If lead times keep shrinking as quickly as they have over the past four months, Nvidia's largest customers should, at least in theory, be able to get their hands on all the GPUs they need.
Aaron Klotz is a contributing writer for Tom’s Hardware, covering news related to computer hardware such as CPUs and graphics cards.