Hacker News
MPSimmons
5 months ago
| on:
Mistral AI Launches New 8x22B MOE Model
I think 4-bit for this is supposed to be over 70GB, so definitely still heavy hardware.
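A rough sanity check of that figure (a sketch assuming roughly 141B total parameters, the count reported for Mixtral 8x22B; the exact number may differ):

```python
# Back-of-the-envelope VRAM estimate for a 4-bit quantized MoE model.
# Assumes ~141e9 total parameters (the figure reported for Mixtral 8x22B);
# real loaders need extra memory for activations, KV cache, and quant scales.
total_params = 141e9
bits_per_param = 4

weight_bytes = total_params * bits_per_param / 8
weight_gb = weight_bytes / 1e9

print(f"~{weight_gb:.1f} GB for weights alone")  # ~70.5 GB
```

That lands right around the 70GB figure, well past the 48GB on a single A6000.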
bevekspldnw
5 months ago
Fucking hell, my A6000 is shy of that and I can’t reasonably justify picking up a second.