Say: LLM Knowledgebase
To provide the user a way to ask questions that can be answered by an LLM
This node is a special variant of the Say LLM node. It allows you to specify a prompt instruction for the LLM and the specific knowledge base it should use as the source of its content.
Large language models are great at understanding user questions and providing sensible answers based on the information stored in a knowledge base. The LLM Knowledge base skill adds this flow node to your flow designer.
When you work with multiple knowledge bases in one smart app, you don't always want to use separate Iterate: LLM Knowledgebase nodes, because you would first have to ask the user a classifying question that helps the Smart App determine which knowledge base to use for its answer. Instead, you can use the regular Iterate node with one or more of these nodes as its branches, each pointing to a different knowledge base.
The Say: LLM Knowledgebase node can be used anywhere in your flow, but it is mostly used as a branch of the Iterate node. Each knowledge base has an implicit Topic Intent that you can link in a branch to trigger the right Say: LLM Knowledgebase node.
The topic intent is automatically generated by the platform when you provide a "description" for your knowledge base in the AI section.
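To make the branching concrete, the sketch below mimics what the Iterate node and the implicit topic intents do for you. The knowledge base names and the keyword-based classifier are hypothetical stand-ins for illustration only; the platform performs this intent matching itself.

```python
# Conceptual sketch only: the Iterate node and the implicit topic intents do this
# routing for you inside the platform. The knowledge base names and the
# keyword-based classifier below are hypothetical stand-ins, not platform APIs.

# Each knowledge base gets a topic intent, generated from the description you
# give it in the AI section.
TOPIC_INTENT_TO_KNOWLEDGE_BASE = {
    "opening_hours": "facilities_kb",   # hypothetical knowledge base names
    "billing": "billing_kb",
}

def classify_topic(question: str) -> str | None:
    """Toy stand-in for the platform's intent recognition."""
    q = question.lower()
    if "open" in q or "hours" in q:
        return "opening_hours"
    if "invoice" in q or "payment" in q:
        return "billing"
    return None

def route_question(question: str) -> str:
    """Pick the branch whose topic intent matches the user's question."""
    intent = classify_topic(question)
    if intent is None:
        return "fallback branch"
    kb = TOPIC_INTENT_TO_KNOWLEDGE_BASE[intent]
    return f"Say: LLM Knowledgebase branch using '{kb}'"

print(route_question("When are you open?"))
# -> Say: LLM Knowledgebase branch using 'facilities_kb'
```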
The LLM can use a specific instruction from you, the Smart App designer, to guide the generation of the answer. This instruction is given in the form of a "Prompt", which can be configured in the AI section of the platform. Besides instructions, the prompt can also contain content that the LLM uses on top of its own dataset to generate answers, such as your address, opening hours, or parking instructions. See the Prompts page for more information on this topic.
Prompt: The prompt which contains the instructions and content the LLM can use when generating a response.
Knowledge base: The knowledge base that the LLM will use as the content for generating responses.
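As an illustration of how these two settings come together, the sketch below assembles an LLM request from a prompt (instructions plus fixed content) and passages retrieved from a knowledge base. The retrieve() helper and the message layout are assumptions for illustration, not the platform's internal implementation.

```python
# Illustrative sketch: how the Prompt and the Knowledge base settings could be
# combined into a single LLM request. The retrieve() helper and the message
# layout are assumptions, not the platform's internal implementation.

def retrieve(knowledge_base: list[str], question: str, top_k: int = 2) -> list[str]:
    """Toy retrieval: rank passages by word overlap with the question."""
    words = set(question.lower().split())
    return sorted(knowledge_base,
                  key=lambda passage: len(words & set(passage.lower().split())),
                  reverse=True)[:top_k]

def build_llm_request(prompt: str, knowledge_base: list[str], question: str) -> list[dict]:
    """Merge the configured prompt with retrieved passages and the user question."""
    context = "\n".join(f"- {p}" for p in retrieve(knowledge_base, question))
    return [
        {"role": "system", "content": f"{prompt}\n\nAnswer using only this content:\n{context}"},
        {"role": "user", "content": question},
    ]

knowledge_base = [
    "Parking is available behind the main building.",
    "We are open Monday to Friday, 9:00 to 17:00.",
]
print(build_llm_request("Answer briefly and politely.", knowledge_base, "Where can I park?"))
```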