"The human brain, which is larger than any LLM, only requires about as much power as a light bulb to operate. So if we want to make AI more efficient, shouldn't we start there? Can we start there? Can neuroscience truly improve AI?" our VP Marketing Christy Maver asked a packed crowd at The AI Conference this week. [Hint: the answer is yes. Yes it can. Visit www.numenta.com to learn how.] #taic2024 #brainbasedAI #neuroscience #thousandbrainsproject
Although current DL and LLM technologies are a vast capability improvement over previous AI methods, they can still generate obvious nonsense and suffer from dramatic inefficiencies. A basic understanding of the complete human brain, not just the cortex, should help move us in a better direction. But long-term, we might ultimately move to totally new ways of thinking.
That's a compelling analogy. It makes me wonder—what other power and performance breakthroughs might come if we learn to model AI HW/SW more closely after the brain?
Well, I agree with Jeff Hawkins that the only model we have of intelligence is the neocortex, so we should focus our attention on reverse engineering that rather than using some other contrived model!
This is the way forward for truly intelligent machines!!
Love this! It made me smile.
Great things have small beginnings.
PhD Student at University of Iowa
I absolutely agree. One wonders if it will come at the expense of accurate memory, as humans misremember all the time and have a limited working memory. Ever since I learned about optogenetics and the ability to probe single neurons, as well as newer technologies such as inferring network structures from cascades (how neurons link together based on activation timing, a patented technology), and since reading the survey book The Self-Assembling Brain, most of my research interests have also aligned with how to improve technologies using biological representations. Congrats on the presentation, and I hope to hear more good news in the future!