Adding a knowledge base to an LLM

8/14/2025

How do I create an LLM workflow that queries a knowledge base?


I'll guide you through creating an LLM workflow that queries a knowledge base in Stack AI. Based on the platform's features, here's a step-by-step approach:

Step 1: Create a New Project

  1. Click the "New Project" button in the top right corner of your Stack AI dashboard

  2. Select "Workflow Builder" and click "Create"

  3. Choose the "Knowledge Base Agent" template or start with a "New Project" template

Step 2: Set Up Your Knowledge Base

  1. First, create a knowledge base by clicking "Knowledge Bases" in the sidebar

  2. Click "Create a Knowledge Base"

  3. Enter a name for your knowledge base and click "Create"

  4. Upload your documents or connect to external sources like:

    • SharePoint
    • Google Drive
    • Dropbox
    • Confluence
    • Gmail (as shown in recent updates)

Step 3: Build Your Workflow

In the Workflow Builder canvas:

Add an Input Node

  1. Drag an Input node from the sidebar to capture user queries

Add a Knowledge Base Node

  1. From the sidebar under "Knowledge Bases," drag your created knowledge base into the canvas
  2. Connect it to receive the user query input
  3. This will perform semantic search on your documents (see the conceptual sketch below)
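
The knowledge base node's retrieval step works like any semantic search: documents are embedded as vectors, the query is embedded the same way, and the closest chunks are returned. The sketch below illustrates the idea with the sentence-transformers library; it is a conceptual stand-in, not Stack AI's internal implementation, and the example documents are made up.

```python
# Conceptual illustration of what a knowledge-base (retrieval) node does:
# embed documents, embed the query, return the most similar chunks.
import numpy as np
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("all-MiniLM-L6-v2")

documents = [
    "Refunds are processed within 5 business days.",
    "Our support team is available Monday through Friday.",
    "Enterprise plans include a dedicated account manager.",
]

# Normalized embeddings let us use a dot product as cosine similarity.
doc_embeddings = model.encode(documents, normalize_embeddings=True)

def retrieve(query: str, top_k: int = 2) -> list[str]:
    """Return the top_k documents most similar to the query."""
    query_embedding = model.encode([query], normalize_embeddings=True)[0]
    scores = doc_embeddings @ query_embedding
    best = np.argsort(scores)[::-1][:top_k]
    return [documents[i] for i in best]

print(retrieve("How long do refunds take?"))
```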

Add an LLM Node

  1. Drag an LLM node from the sidebar
  2. In the LLM configuration:
    • Reference the knowledge base results in your prompt using {knowledge-base-node-name}
    • Write a prompt like: "Based on the following context: {knowledge-base-0}, answer the user's question: {input-0}" (the sketch below shows how these placeholders are resolved)
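
At run time, Stack AI substitutes each placeholder with the output of the connected node. The snippet below is only a hypothetical illustration of that substitution (the fill_template helper is not part of Stack AI); it shows what the final prompt looks like once {knowledge-base-0} and {input-0} are filled in.

```python
# Hypothetical illustration of placeholder substitution. Stack AI performs
# this step for you inside the LLM node; the helper exists only to show
# what the resolved prompt looks like.
PROMPT_TEMPLATE = (
    "Based on the following context: {knowledge-base-0}, "
    "answer the user's question: {input-0}"
)

def fill_template(template: str, values: dict[str, str]) -> str:
    """Substitute each {placeholder} with its value."""
    for name, value in values.items():
        template = template.replace("{" + name + "}", value)
    return template

prompt = fill_template(
    PROMPT_TEMPLATE,
    {
        "knowledge-base-0": "Refunds are processed within 5 business days.",
        "input-0": "How long do refunds take?",
    },
)
print(prompt)
```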

Add an Output Node

  1. Connect an Output node to display the LLM's response

Step 4: Configure Advanced Features (Optional)

For Better Accuracy:

  • Enable metadata filtering in your knowledge base settings for improved retrieval accuracy
  • Use Dynamic Vector Store nodes for in-memory document embeddings if needed

For Conversation Memory:

  • Add a Shared Memory node to maintain context across multiple interactions (see the sketch below)
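
Conceptually, shared memory keeps recent user/assistant turns and feeds them back into the next prompt so follow-up questions resolve correctly. A minimal sketch, assuming a simple turn buffer (the ConversationMemory class is illustrative, not Stack AI's API):

```python
# Illustrative turn buffer: prior exchanges are stored and prepended as
# context for the next prompt. Not Stack AI's API, just the concept.
class ConversationMemory:
    def __init__(self, max_turns: int = 10):
        self.turns: list[tuple[str, str]] = []
        self.max_turns = max_turns

    def add(self, user_message: str, assistant_reply: str) -> None:
        self.turns.append((user_message, assistant_reply))
        self.turns = self.turns[-self.max_turns:]  # keep only recent context

    def as_context(self) -> str:
        return "\n".join(f"User: {u}\nAssistant: {a}" for u, a in self.turns)

memory = ConversationMemory()
memory.add("How long do refunds take?", "Refunds take 5 business days.")
print(memory.as_context())
```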

For Complex Logic:

  • Use If/Else nodes or AI Routing nodes for conditional processing (illustrated below)
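
Routing simply decides which branch of the workflow handles a query. The sketch below uses a keyword heuristic for clarity; an AI Routing node would instead ask an LLM classifier to make the same decision, and the branch names here are hypothetical.

```python
# Illustrative conditional routing: billing questions go to the
# knowledge-base branch, everything else to a plain LLM branch.
def route(query: str) -> str:
    """Pick a branch name for the incoming query (keyword heuristic)."""
    billing_terms = ("refund", "invoice", "billing", "payment")
    if any(term in query.lower() for term in billing_terms):
        return "knowledge_base_branch"
    return "general_llm_branch"

print(route("Where is my refund?"))   # knowledge_base_branch
print(route("Tell me a joke."))       # general_llm_branch
```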

Step 5: Test and Deploy

  1. Test your workflow using the chat interface
  2. Check the Analytics view to monitor performance
  3. Use the Export view to deploy your agent as a chatbot or API (see the example request below)
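
If you deploy as an API, calling the agent is an ordinary HTTP POST. The example below is a hedged sketch: the URL, header, and payload field names are placeholders, so replace them with the exact values shown in your project's Export view, since the real schema may differ.

```python
# Hedged sketch of calling a deployed workflow over HTTP. The URL,
# credential, and payload field below are placeholders, not Stack AI's
# documented schema -- copy the real values from the Export view.
import requests

API_URL = "https://api.example.com/your-deployed-workflow"  # placeholder URL
API_KEY = "YOUR_API_KEY"  # placeholder credential

response = requests.post(
    API_URL,
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={"input-0": "How long do refunds take?"},  # field name is illustrative
    timeout=30,
)
response.raise_for_status()
print(response.json())
```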

Pro Tips:

  • Type / in an LLM prompt to reference knowledge bases
  • Type @ in a prompt to reference tools
  • Enable bounding boxes for PDF citations to improve accuracy
  • Consider using the Prompt Library to store and reuse effective prompts

This approach creates a Retrieval-Augmented Generation (RAG) system where your LLM can query and reason over your knowledge base content to provide accurate, contextual responses.