Iterate: LLM
To provide the user with a way to ask questions that can be answered by an LLM
The Iterate: LLM node is used in the same way as the Iterate: default node. This means it is most often used as a form of main menu for your Smart App: "What can I do for you today?".
Large language models are great at understanding user questions and providing sensible answers based on the general knowledge stored in the model. The LLM Flows skill adds this flow node to your flow designer.
The difference with the default Iterate node is that it uses the LLM to provide an answer to the user when none of the links are triggered. When the LLM does not have an answer either, it still fires the __unknown__ dialog trigger, which you can configure on this node to handle this event.
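The resulting order of evaluation can be summarised as: labelled links first, then the LLM, then the __unknown__ dialog trigger. The Python sketch below only illustrates that order; the names used (Link, match_intent, ask_llm, handle_turn) and the matching logic are hypothetical placeholders, not part of the platform.

```python
# Minimal sketch of the decision order only, not the platform's actual code.
# All names here (Link, match_intent, ask_llm, handle_turn) are hypothetical.
from __future__ import annotations
from dataclasses import dataclass

@dataclass
class Link:
    label: str
    intents: list[str]

def match_intent(user_input: str, intents: list[str]) -> bool:
    # Placeholder: the real platform matches intents with its own NLU engine.
    return any(phrase in user_input.lower() for phrase in intents)

def ask_llm(prompt: str, user_input: str) -> str | None:
    # Placeholder: the real node calls the configured LLM with the prompt.
    return None  # pretend the LLM had no answer this turn

def handle_turn(user_input: str, links: list[Link], prompt: str) -> tuple[str, str]:
    # 1. Labelled links take priority: if an intent matches, follow that link.
    for link in links:
        if match_intent(user_input, link.intents):
            return ("link", link.label)
    # 2. Otherwise, let the LLM generate an answer guided by the prompt.
    answer = ask_llm(prompt, user_input)
    if answer is not None:
        return ("llm", answer)
    # 3. If the LLM has no answer either, fire the __unknown__ dialog trigger.
    return ("dialog_trigger", "__unknown__")

links = [Link("Talk to a human", ["human", "agent"]), Link("Done", ["done"])]
print(handle_turn("Where can I park?", links, "You are a helpful receptionist."))
```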
As explained in the introduction, the Iterate: LLM node is used in the same way as the default Iterate node: it works as a menu. It either responds because one of the labelled links or dialog triggers is triggered, or it otherwise responds with an answer generated by the LLM.
In the example above, the Iterate: LLM node matches the intents associated with the two labelled links "Talk to a human" and "Done". When the user does not appear to be asking for one of these intents, it uses the LLM to generate an answer.
The LLM can use a specific instruction from you (as the Smart App designer) to guide the generation of the answer. This instruction is given in the form of a "Prompt", which can be configured in the AI section of the platform. Besides instructions, the prompt also contains content that the LLM can use on top of its own knowledge to generate answers, such as your address, opening hours, or parking instructions. See the Prompts page for more information on this topic.
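As an illustration, such a prompt could combine an instruction with company-specific content. The wording and details below (the clinic name, address, and opening hours) are invented for the example; the actual prompt is whatever you configure on the Prompts page.

```python
# Purely illustrative prompt text; the real prompt is configured in the
# AI section of the platform, and all details below are made up.
prompt = """You are the virtual receptionist of Example Clinic.
Answer briefly and politely, using only the information below.
If the answer is not in this information, say you don't know.

Address: Example Street 1, Example City
Opening hours: Mon-Fri 08:00-18:00
Parking: visitor parking behind the building, entrance via Example Lane
"""
print(prompt)
```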
Question
The initial question
Repeat (1) (optional)
The text to use when re-prompting the user after a turn
Repeat (2)
The next text to use when re-prompting the user after a turn. The system keeps iterating this step for the third and all subsequent turns
Prompt
The prompt which contains the instructions and content the LLM can use when generating a response
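Put together, a filled-in node could conceptually look like the sketch below. This is not a documented file format: the node is configured through the flow designer UI, and the keys and example values are purely illustrative.

```python
# Conceptual sketch only: the Iterate: LLM node is configured in the flow
# designer UI, not in code, and these keys are illustrative placeholders.
iterate_llm_node = {
    "question": "What can I do for you today?",                 # the initial question
    "repeat_1": "Is there anything else I can help you with?",  # optional re-prompt after a turn
    "repeat_2": "Anything else?",                                # used for the third and later turns
    "prompt": "You are the virtual receptionist of Example Clinic. ...",  # LLM instructions + content
}
print(iterate_llm_node["question"])
```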