Mark Hookey’s Post

CEO @ Demyst | Data Workflows, Digital, Automation

How can this continue? Even if you buy the argument that Nvidia has a unique long-term advantage in providing hardware for AI applications (I don't), how can consumption continue at this rate? Back when I was more heavily involved in building the underlying algorithms to train ML/AI models, it was very clear that the compute needed to train a model was 100x, 1,000x, or 10,000x the compute needed to apply it (predict). I don't know the exact numbers, but GenAI is even more extreme: people talk about millions of dollars to train a model and cents per prediction. This gold rush on chips seems predicated on the assumption that AI applications will grow exponentially (they probably will), yet the real driver of compute is building new AI models, not applying them. Why would model building grow exponentially?
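
For a rough sense of the asymmetry the post describes, a common rule of thumb from the scaling-law literature (not from the post itself) estimates training cost at about 6·N·D FLOPs for a dense model with N parameters trained on D tokens, versus about 2·N FLOPs per generated token at inference. A minimal Python sketch, with N and D as purely illustrative assumptions:

# Back-of-envelope: training vs. inference FLOPs for a large language model.
# The 6*N*D (training) and 2*N (per-token inference) approximations are
# standard rules of thumb; N and D below are illustrative assumptions,
# not figures from the post.
N = 70e9    # model parameters (e.g. a 70B-parameter model)
D = 1.4e12  # training tokens

training_flops = 6 * N * D   # ~5.9e23 FLOPs, paid once
flops_per_token = 2 * N      # ~1.4e11 FLOPs for every generated token

# Tokens that must be served before cumulative inference compute
# matches the one-time training bill:
breakeven_tokens = training_flops / flops_per_token
print(f"training:  {training_flops:.2e} FLOPs")
print(f"inference: {flops_per_token:.2e} FLOPs/token")
print(f"break-even after {breakeven_tokens:.2e} tokens")  # ~4.2e12

On these assumptions the per-call cost really is tiny relative to training, though at sufficient usage volume the aggregate inference compute can still overtake the one-time training cost.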

Charles-Henry Monchau, CFA, CMT, CAIA

Chief Investment Officer & Member of the Executive Committee at Syz Group

All German stocks combined are now worth less than Nvidia. Germany's entire stock market is valued at ~$2.5tn; Nvidia is valued at $2.62tn. This week alone, Nvidia gained $348bn in value, roughly the combined worth of traditional German companies Siemens, Deutsche Telekom, and Mercedes. Source: HolgerZ, Bloomberg

Mark Hookey

CEO @ Demyst | Data Workflows, Digital, Automation

2mo

Down 20% in the last month

Nissan Dookeran

I help teams by leading innovation through disruptive but practical approaches to applied AI and Blockchain consulting, architecture, and development

4mo

You should take a look at Jeremy Howard's team's work on cutting the hardware needed to fine-tune large GenAI models down to just two consumer Nvidia GPUs. See https://www.answer.ai/posts/2024-03-06-fsdp-qlora.html
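
For context, the linked Answer.AI post combines FSDP (sharding model state across GPUs) with QLoRA (a 4-bit quantized base model plus small trainable low-rank adapters), which is what brings a 70B-parameter fine-tune within reach of two consumer GPUs. Below is a minimal sketch of the QLoRA half using the Hugging Face transformers and peft libraries; the model name and hyperparameters are illustrative assumptions, and the FSDP sharding itself is configured separately (e.g. via Accelerate):

# Minimal QLoRA-style setup (sketch; model name and hyperparameters
# are illustrative assumptions, not from the linked post).
import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig
from peft import LoraConfig, get_peft_model, prepare_model_for_kbit_training

bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,                    # quantize base weights to 4 bits
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)

# Hypothetical base model; the linked post targets 70B-class models.
model = AutoModelForCausalLM.from_pretrained(
    "meta-llama/Llama-2-70b-hf",
    quantization_config=bnb_config,
    device_map="auto",
)
model = prepare_model_for_kbit_training(model)

lora_config = LoraConfig(
    r=16,
    lora_alpha=32,
    lora_dropout=0.05,
    target_modules=["q_proj", "v_proj"],  # adapters on attention projections
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # only the small adapter weights train

Only the low-rank adapter matrices receive gradients, so optimizer state stays small; the novel part of the Answer.AI work was getting FSDP to shard the quantized base weights across the two GPUs, which this sketch does not show.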
