From the course: Building Secure and Trustworthy LLMs Using NVIDIA Guardrails
Building a custom inquiry action
- [Instructor] Let's take a look at an exciting small project where we build a weather inquiry action for our LLM. This is a great example because it not only builds your technical skills, it also shows you the power of Colang for building real-time LLM conversations that integrate real data. So let's dive right in. First, I'll set up my environment by installing the necessary packages, which are nemoguardrails and openai. I already have these installed, but you can execute the pip install command to install these packages. Once this is done, we configure our OPENAI_API_KEY. This is essential for authenticating our requests to the OpenAI service. Make sure you replace the OPENAI_API_KEY here with your own API key. Now we come to the main part. As we've seen before, every Colang chatbot has two components: the Colang content and the YAML content. The weather inquiry action needs to be defined using a flow. So here, if you look at the flow, we have the flow…
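The YAML content mentioned above configures which LLM the guardrails runtime talks to. A minimal sketch of a NeMo Guardrails `config.yml` might look like this; the exact model name is an assumption and should match whichever OpenAI model you have access to.

```yaml
# config.yml — minimal model configuration for NeMo Guardrails
models:
  - type: main
    engine: openai
    model: gpt-3.5-turbo-instruct  # hypothetical choice; use your preferred model
```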
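A weather inquiry flow in the Colang content could be sketched along these lines. The user-utterance examples, the flow name, and the `get_weather` action name are illustrative assumptions, not the course's exact code:

```colang
# Sample user messages that should trigger the weather flow
define user ask weather
  "what's the weather like?"
  "tell me the weather in London"

# Flow: when the user asks about weather, run a custom action
# and have the bot respond with the result
define flow weather inquiry
  user ask weather
  $weather = execute get_weather
  bot provide weather update
```

The `execute` keyword is how a Colang flow hands control to a custom action, which is where real data (such as a live weather lookup) enters the conversation.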
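The setup steps described above (installing the packages, then setting the API key) can be sketched as follows. The packages are installed once from the command line with `pip install nemoguardrails openai`; the key value shown here is a placeholder, not a real credential.

```python
import os

# Set the OpenAI API key as an environment variable so that the
# OpenAI client (used by NeMo Guardrails) can authenticate requests.
# Replace the placeholder below with your own key.
os.environ["OPENAI_API_KEY"] = "sk-your-key-here"
```

Setting the key via an environment variable keeps the credential out of your notebook or script logic, and it is the variable name the OpenAI client library looks up by default.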
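The custom action itself is an ordinary Python function that you register with the guardrails app. Here is a minimal sketch with canned data standing in for a real weather API call; the function name, the sample cities, and their values are all hypothetical:

```python
def get_weather(location: str = "London") -> str:
    """Hypothetical weather-lookup action.

    In a real application this would call a weather API; here we
    return canned data so the sketch is self-contained.
    """
    fake_data = {
        "London": "15°C, cloudy",
        "Tokyo": "22°C, sunny",
    }
    return fake_data.get(location, "Weather data unavailable")
```

With NeMo Guardrails, a function like this is typically made available to flows by registering it on the rails app (for example via `register_action`), after which `execute get_weather` in a Colang flow will invoke it.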