Say: LLM Knowledgebase
Provide the user with a way to ask questions that can be answered by an LLM.
This node is a special type of Say: LLM node. This variant lets you specify a prompt instruction for the LLM and the specific knowledge base from which to retrieve content.
Large language models (LLMs) are great at understanding user questions and providing sensible answers based on the information stored in a knowledge base. The LLM Knowledgebase skill adds this Flow node to your Flow designer.
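As a rough illustration of the idea, answering from a knowledge base comes down to finding the passages most relevant to the user's question and handing them to the LLM as grounding material. The Python sketch below is not the platform's implementation; the naive word-overlap ranking and the example passages are assumptions made purely for illustration.

```python
# Illustration only: naive relevance ranking of knowledge base passages.
# The real node uses the platform's own retrieval; this just shows the idea.

def rank_passages(question: str, passages: list[str], top_k: int = 3) -> list[str]:
    """Return the passages sharing the most words with the question."""
    question_words = set(question.lower().split())
    scored = sorted(
        passages,
        key=lambda p: len(question_words & set(p.lower().split())),
        reverse=True,
    )
    return scored[:top_k]

knowledge_base = [
    "Our office is open Monday to Friday from 9:00 to 17:00.",
    "Visitors can park in the garage behind the main building.",
    "The cafeteria serves lunch between 12:00 and 13:30.",
]

print(rank_passages("When are you open?", knowledge_base, top_k=1))
```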
When to use?
When you work with multiple knowledge bases in one Smart App, you don't always want to use separate Say: LLM Knowledgebase nodes on their own. If you did, you would need to ask the user a classifying question so the Smart App could determine which knowledge base to use for its answer. Instead, you can use an Iterate node as the "menu" and point to one or more Say: LLM Knowledgebase nodes as its branches, each pointing to a different knowledge base.
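This "menu" pattern can be pictured as a simple dispatch: each branch is tied to a topic, and the matching branch decides which knowledge base answers the question. The sketch below is only a schematic of that routing; the topic keywords and knowledge base names are invented for the example and are not part of the platform.

```python
# Schematic of the "menu" pattern: one branch per knowledge base.
# Topic keywords and knowledge base names are invented for illustration.

BRANCHES = {
    "hr": {"keywords": {"salary", "leave", "vacation"}, "knowledge_base": "hr_handbook"},
    "it": {"keywords": {"laptop", "password", "wifi"}, "knowledge_base": "it_support"},
}

def route(question: str) -> str:
    """Pick the knowledge base whose branch matches the question best."""
    words = set(question.lower().split())
    best = max(BRANCHES.values(), key=lambda b: len(words & b["keywords"]))
    if not words & best["keywords"]:
        return "general_faq"  # fallback when no branch matches
    return best["knowledge_base"]

print(route("How do I reset my wifi password?"))  # -> it_support
```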
How to use?
The Say: LLM Knowledgebase node can be used anywhere in your Flow, but it is most often used as a branch of the Iterate node. Each knowledge base has an implicit Topic Intent that you can link to in the branch to trigger the right Say: LLM Knowledgebase node.

The LLM can use a specific instruction from you (as the Smart App designer) to guide the generation of the answer. This instruction is given in the form of a "Prompt", which can be configured in the AI section of the platform. Besides instructions, the prompt also contains content that the LLM can use on top of its own training data to generate answers, such as your address, opening hours, parking instructions, and so on. See the Prompts page for more information on this topic.
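To make the role of the prompt concrete, the sketch below shows one way instruction text, extra content (such as an address and opening hours), retrieved passages, and the user's question could be combined into a single request. The structure and the example values are assumptions for illustration only; the actual prompt is configured in the AI section of the platform as described above.

```python
# Illustration of how a prompt might be assembled; not the platform's format.

def build_prompt(instruction: str, extra_content: str, passages: list[str], question: str) -> str:
    """Combine the designer's instruction, static content, and retrieved passages."""
    context = "\n".join(f"- {p}" for p in passages)
    return (
        f"{instruction}\n\n"
        f"Background information:\n{extra_content}\n\n"
        f"Relevant passages:\n{context}\n\n"
        f"Question: {question}"
    )

prompt = build_prompt(
    instruction="Answer briefly and only use the information provided.",
    extra_content="Address: Main Street 1. Opening hours: Mon-Fri 9:00-17:00.",
    passages=["Visitors can park in the garage behind the main building."],
    question="Where can I park?",
)
print(prompt)  # in the real node, this text would be sent to the LLM
```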
Node properties
Prompt
The prompt that contains the instructions and content the LLM can use when generating a response.