I find the complete absence of fully open-source stacks for edge inference from the program of The Linux Foundation's AI_Dev conference quite sad. Does the Linux Foundation really believe that the cooperative model we find in the Linux ecosystem isn't good enough for accelerator drivers? That it is somehow fine to accept that, in order to accelerate inference at the edge, your company will have to rely on random binaries from the internet and run a heavily patched, ancient Linux version on your devices? #opensource #linuxfoundation #npu #linux
In less than 3 weeks, AI_Dev Paris will bring together leading AI developers and innovators from around the world. 🌏 This June 19-20, immerse yourself in advanced discussions, hands-on sessions, and a collaborative atmosphere that showcases the potential of open source AI. We're grateful to all our sponsors and speakers whose support makes this global gathering possible. 🎟 Don’t miss out—register today: https://lnkd.in/ewX8iENP #opensource #lfaidata #oss
It may become easier to understand if you think of the Linux Foundation more as a trade organization whose aim is to protect the commercial interests of its members, rather than a free software organization working for the community. I share the sad feeling.
"fully open-source stacks for edge inference" — how could that be done? After working with two different edge inference IP stacks and multiple platforms hosting them, there was nothing in common except the client interfaces on one end and how matrices/vectors multiply on the other. The way a model is compiled and inferred is totally proprietary. Making any of it open source would take away the competitive advantage of silicon companies at the moment. Also, Linux is a poor choice of OS for AI inference in its current form. Look up AIOS, where the model keeps control rather than being controlled (by the CPU), which is probably what we should aim for.
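To make the comment's point concrete: the two layers described as common across vendors can be sketched as a thin client interface on one end and a plain matrix multiply on the other, with everything in between (compiler, scheduler, firmware) being the proprietary part. This is a minimal illustrative sketch; the class names and the `load`/`run` interface are hypothetical, not any vendor's real API.

```python
# Hypothetical sketch of the "common" layers across edge inference stacks.
# All names here (InferenceBackend, ReferenceCpuBackend) are illustrative.
from abc import ABC, abstractmethod

class InferenceBackend(ABC):
    """The client-interface end that tends to look alike across vendors:
    load a compiled model blob, then run it on input tensors."""

    @abstractmethod
    def load(self, model_blob: bytes) -> None: ...

    @abstractmethod
    def run(self, inputs: list[list[float]]) -> list[list[float]]: ...

class ReferenceCpuBackend(InferenceBackend):
    """Open reference implementation of the other common end: how
    matrices multiply. A real NPU backend would route the same call
    through a proprietary compiler and firmware instead."""

    def load(self, model_blob: bytes) -> None:
        # Pretend the blob encodes a single weight matrix; here we
        # hard-code a 2x2 identity purely for illustration.
        self.weights = [[1.0, 0.0], [0.0, 1.0]]

    def run(self, inputs: list[list[float]]) -> list[list[float]]:
        # Plain matmul: inputs (n x 2) times weights (2 x 2).
        return [
            [sum(x * w for x, w in zip(row, col))
             for col in zip(*self.weights)]
            for row in inputs
        ]

backend = ReferenceCpuBackend()
backend.load(b"")
print(backend.run([[3.0, 4.0]]))  # identity weights, so output equals input
```

Everything between those two ends, how the graph is lowered, tiled, and dispatched to the accelerator, is exactly the part the comment says vendors keep closed.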