Azure Feeds’ Post


Keep up to date with the ever-changing and evolving Microsoft Azure ecosystem.

SemanticKernel – 📎 Chat Service demo running the Llama2 LLM locally in Ubuntu.

Hi! Today's post is a demo of how to interact with a local LLM using Semantic Kernel. In my previous post, I wrote about using LM Studio to host a local server. Today we will use Ollama in Ubuntu to host the LLM.

Ollama

Ollama is an open-source platform designed for running large language models (LLMs) locally. It gives developers a convenient way to run LLMs on their own machines, allowing experimentation, fine-tuning, and customization. With Ollama, you can create and execute scripts directly, without relying on external...

#techcommunity #azure #microsoft https://lnkd.in/gWKXGz8i
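As a rough illustration of what "interacting with a local LLM" looks like once Ollama is serving a model, here is a minimal Python sketch that talks to Ollama's local REST endpoint (`/api/generate` on its default port 11434). The model name `llama2` and the prompt are placeholders; this is not the Semantic Kernel integration from the demo itself, just the underlying HTTP call the local server exposes.

```python
import json
import urllib.request

# Ollama's default local REST endpoint; adjust host/port if you changed the config.
OLLAMA_URL = "http://localhost:11434/api/generate"


def build_request(prompt: str, model: str = "llama2") -> dict:
    """Build the JSON payload for Ollama's /api/generate endpoint.

    "stream": False requests a single JSON response instead of
    newline-delimited streaming chunks.
    """
    return {"model": model, "prompt": prompt, "stream": False}


def ask_llama(prompt: str) -> str:
    """Send a prompt to the locally running Ollama server and return the reply text."""
    data = json.dumps(build_request(prompt)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]


# Example usage (requires `ollama serve` running and `ollama pull llama2` done first):
# print(ask_llama("Why is the sky blue?"))
```

Before running this, you would install Ollama in Ubuntu, pull the model with `ollama pull llama2`, and make sure the server is up; Semantic Kernel can then be pointed at the same local endpoint instead of a cloud service.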


