🦙 Tool calling with Ollama

We now have a partner package with Ollama to help you perform tool calling, which is now natively supported in Ollama. Tools are utilities (like APIs or custom functions) that extend an LLM's capabilities, but local LLMs often struggle both to select the right tool and to supply the correct input. In the video below, we use the new Ollama partner package to perform tool calling with the recent Groq fine-tune of Llama 3 8B. See how to create a simple tool-calling agent in LangGraph with web-search and vector-store retrieval tools that run locally.

🎥 Video: https://lnkd.in/erppcmdY
🐍 Partner package (Python): https://lnkd.in/ej7KUQCr
🦏 Partner package (JavaScript): https://lnkd.in/eY8giWBY
📓 Notebook: https://lnkd.in/ewfNM_dc
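To make the mechanics concrete, here is a minimal sketch of the loop a tool-calling agent runs: the model emits a structured call naming a tool and its arguments, and the agent dispatches it to the matching function. All names here (`web_search`, `retrieve`, `dispatch`, the fake model output) are illustrative stand-ins, not the partner package's actual API.

```python
# Illustrative sketch of the tool-calling loop a local agent performs.
# The tool names and dispatch helper are hypothetical -- they mimic the
# shape of an LLM tool call, not the Ollama partner package's API.

def web_search(query: str) -> str:
    """Stand-in for a web-search tool."""
    return f"search results for: {query}"

def retrieve(query: str) -> str:
    """Stand-in for a local vector-store retrieval tool."""
    return f"retrieved documents about: {query}"

# The agent exposes its tools to the model by name.
TOOLS = {"web_search": web_search, "retrieve": retrieve}

def dispatch(tool_call: dict) -> str:
    """Route a model-emitted tool call to the matching function."""
    fn = TOOLS[tool_call["name"]]
    return fn(**tool_call["args"])

# A tool-calling model responds with a structured call like this
# instead of free text; the agent executes it and feeds the result back.
fake_model_output = {"name": "retrieve", "args": {"query": "agent memory"}}
print(dispatch(fake_model_output))  # retrieved documents about: agent memory
```

The hard part for small local models is producing that structured call reliably, which is exactly what the fine-tuned model and the partner package target.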
This is a fantastic partnership! The Ollama package will be a game-changer for local LLMs struggling with tool selection and input provision. Looking forward to seeing the impact this has on improving LLM capabilities. #Innovation #Partnership #TechAdvancement
I've been waiting for a local model provider to be able to handle tool calling and am so happy it happened to be Ollama to do so. Couldn't be more excited to check this out, thanks for posting!!!
Not specifically talking about LangChain x Ollama, but tool calling has far more potential than is generally perceived. It could be as revolutionary as the LLM itself. It opens up so many opportunities...
Finally, we can test tool calling locally with Ollama…
Excellent news. Testing tool calling with a local model is what we've been waiting for.
Will it work on smaller models, such as gemma:2b?
This is great 😀
So excited to use these new features!!!
This is a great advancement in tool calling! The integration of the Ollama package will undoubtedly enhance the capabilities of local LLMs. Excited to see how this evolves and simplifies tasks for developers. #ToolCalling #TechInnovation