This repo is an implementation of a locally hosted chatbot specifically focused on question answering over the LangChain documentation. Built with LangChain and Next.js.
Deployed version: chatjs.langchain.com
Looking for the Python version? Click here
- Install dependencies via `yarn install`.
- Set the required environment variables listed inside `backend/.env.example` for the backend, and `frontend/.env.example` for the frontend.
- Build the backend via `yarn build --filter=backend` (from root).
- Run the ingestion script by navigating into `./backend` and running `yarn ingest`.
- Navigate into `./frontend` and run `yarn dev` to start the frontend.
- Open localhost:3000 in your browser.
There are two components: ingestion and question-answering.
Ingestion has the following steps:
- Pull HTML from the documentation site as well as the GitHub codebase
- Load the HTML with LangChain's RecursiveUrlLoader and SitemapLoader
- Split documents with LangChain's RecursiveCharacterTextSplitter
- Create a vectorstore of embeddings, using LangChain's Weaviate vectorstore wrapper (with OpenAI's embeddings).
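The splitting step is the core transformation in ingestion. Below is a minimal, self-contained sketch of the recursive strategy that LangChain's RecursiveCharacterTextSplitter uses: try the coarsest separator first (paragraph breaks), then fall back to finer ones until chunks fit the size limit. This toy version is illustrative only, not the library's implementation, and the function name is hypothetical:

```typescript
// Illustrative sketch of recursive character splitting (not LangChain's code).
// Splits on "\n\n" first, then "\n", then " ", then hard-splits by length.
function recursiveSplit(
  text: string,
  chunkSize: number,
  separators: string[] = ["\n\n", "\n", " ", ""]
): string[] {
  if (text.length <= chunkSize) return [text];
  const [sep, ...rest] = separators;
  if (sep === "") {
    // No separator left: hard-split by character count.
    const hard: string[] = [];
    for (let i = 0; i < text.length; i += chunkSize) {
      hard.push(text.slice(i, i + chunkSize));
    }
    return hard;
  }
  const pieces = text.split(sep);
  const chunks: string[] = [];
  let current = "";
  for (const piece of pieces) {
    const candidate = current === "" ? piece : current + sep + piece;
    if (candidate.length <= chunkSize) {
      current = candidate; // keep merging pieces while they fit
    } else {
      if (current !== "") chunks.push(current);
      current = "";
      if (piece.length > chunkSize) {
        // Piece is still too big: recurse with finer separators.
        chunks.push(...recursiveSplit(piece, chunkSize, rest));
      } else {
        current = piece;
      }
    }
  }
  if (current !== "") chunks.push(current);
  return chunks;
}

const chunks = recursiveSplit(
  "Intro paragraph.\n\nSecond paragraph with more words.",
  25
);
// chunks: ["Intro paragraph.", "Second paragraph with", "more words."]
```

In the actual pipeline each chunk is then embedded with OpenAI's embeddings and written to Weaviate via LangChain's vectorstore wrapper.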
Question-Answering has the following steps:
- Given the chat history and new user input, use GPT-3.5 to formulate a standalone question.
- Given that standalone question, look up relevant documents from the vectorstore.
- Pass the standalone question and relevant documents to the model to generate and stream the final answer.
- Generate a trace URL for the current chat session, as well as the endpoint to collect feedback.
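The first step above (condensing history plus a follow-up into a standalone question) can be sketched without the model call: render the conversation into a "condense" prompt, which is then sent to GPT-3.5. The template wording, type, and function name below are illustrative assumptions, not the app's actual prompt:

```typescript
// Hypothetical sketch of building a condense-question prompt.
type ChatTurn = { role: "human" | "ai"; content: string };

function buildCondensePrompt(history: ChatTurn[], question: string): string {
  const rendered = history
    .map((t) => `${t.role === "human" ? "Human" : "Assistant"}: ${t.content}`)
    .join("\n");
  return [
    "Given the conversation below, rephrase the follow-up question",
    "to be a standalone question.",
    "",
    "Chat History:",
    rendered,
    `Follow Up Input: ${question}`,
    "Standalone question:",
  ].join("\n");
}

const prompt = buildCondensePrompt(
  [
    { role: "human", content: "What is a vectorstore?" },
    { role: "ai", content: "A database of embeddings you can search by similarity." },
  ],
  "How do I create one?"
);
```

The model's completion ("How do I create a vectorstore?", say) is what gets embedded and used for retrieval, which is why follow-ups like "How do I create one?" still find the right documents.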
Looking to use or modify this Use Case Accelerant for your own needs? We've added a few docs to aid with this:
- Concepts: A conceptual overview of the different components of Chat LangChain. Goes over features like ingestion, vector stores, query analysis, etc.
- Modify: A guide on how to modify Chat LangChain for your own needs. Covers the frontend, backend and everything in between.
- Running Locally: The steps to take to run Chat LangChain 100% locally.
- LangSmith: A guide on adding robustness to your application using LangSmith. Covers observability, evaluations, and feedback.
- Production: Documentation on preparing your application for production usage. Explains different security considerations, and more.
- Deployment: How to deploy your application to production. Covers setting up production databases, deploying the frontend, and more.