Flow Components

LLMs

OpenAI ChatGPT

This component takes the input information and, based on the component configuration, calls OpenAI's ChatGPT service.

AI Model

  • Model Version: The version of the LLM model.

  • Creativity: Determines whether the LLM output is more precise or more creative. A higher value produces more creative but less predictable output.

  • Token Settings: Allocates weights among the extension features within the LLM's limited context. Enable only the extension features you need to save tokens. Enabled extension features are shown as active on this component, from left to right: Long-Term Memory, Short-Term Memory, Identity Prompts, Knowledge Base, Plugins.

Prompt Function Configuration

In this area you can drag and drop to reorder the extension features and also enter text between extension feature blocks. The extension features and the entered text are combined to form the final prompt.

The following extension features require special settings:

  • Identity Prompts: Enter the identity the LLM should assume, such as its role, tasks, and emotions, to help the LLM better understand what it needs to do.

  • Plugins: Add the plugins to be called in this section so the LLM can use them to perform specific tasks.

Please note that the order of extension features is crucial: it directly affects how the LLM interprets the prompt and therefore the output results.
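The assembly described above can be sketched as simple concatenation of ordered segments. This is an illustrative sketch only; the block names, tags, and function are hypothetical stand-ins, not the product's actual API.

```python
# Hypothetical sketch: the final prompt is built by concatenating the
# extension feature blocks and free-text segments in their configured order.
def assemble_prompt(segments):
    """Join ordered prompt segments (feature blocks or plain text) into one prompt."""
    parts = []
    for kind, content in segments:
        if kind == "text":
            parts.append(content)
        else:  # an extension feature block, e.g. "identity" or "knowledge"
            parts.append(f"[{kind.upper()}]\n{content}")
    return "\n\n".join(parts)

# Reordering the segments changes the prompt, which is why block order matters.
prompt = assemble_prompt([
    ("identity", "You are a helpful travel assistant."),
    ("text", "Use the retrieved documents below to answer."),
    ("knowledge", "Doc 1: Visa rules for ..."),
])
```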

Knowledge Vector

Knowledge Vector

This component uses the input information to retrieve knowledge from a knowledge base and outputs the recalled results.

  • Doc Correlation: The minimum relevance a document must meet to be recalled from the knowledge scope.

  • Matching Knowledge: The maximum number of knowledge entries to recall from the knowledge scope.

  • Knowledge Vector Scope: Select a subset of the Bot's knowledge base documents as the retrieval scope for this Knowledge Vector component.
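The interplay of the three settings above can be sketched as a similarity search that filters by a minimum score (Doc Correlation) and caps the result count (Matching Knowledge) within a chosen document subset. This is a minimal illustration with toy vectors, not the product's actual retrieval implementation.

```python
# Hypothetical sketch of knowledge recall: score every in-scope document,
# drop those below the minimum correlation, and keep at most max_matches.
def recall_knowledge(query_vec, docs, min_correlation=0.7, max_matches=3):
    """docs is a list of (text, embedding) pairs; returns recalled texts."""
    def cosine(a, b):
        dot = sum(x * y for x, y in zip(a, b))
        na = sum(x * x for x in a) ** 0.5
        nb = sum(x * x for x in b) ** 0.5
        return dot / (na * nb) if na and nb else 0.0

    scored = [(cosine(query_vec, vec), text) for text, vec in docs]
    scored = [s for s in scored if s[0] >= min_correlation]  # Doc Correlation
    scored.sort(key=lambda s: s[0], reverse=True)
    return [text for _, text in scored[:max_matches]]        # Matching Knowledge

# Toy 2-dimensional embeddings restricted to an in-scope document subset.
docs = [("A", [1.0, 0.0]), ("B", [0.0, 1.0]), ("C", [0.9, 0.1])]
recall_knowledge([1.0, 0.0], docs)  # recalls "A" and "C", skips "B"
```

Raising Doc Correlation trades recall for precision; raising Matching Knowledge returns more entries at the cost of tokens downstream.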

Flow Rules

Branch Predicate

This component passes the input information to the LLM, which determines which branch it belongs to, and the flow then continues along that branch.

AI Model

  • Model Version: The version of the LLM model that performs the branch determination.

  • Creativity: Determines whether the LLM output is more precise or more creative. A higher value indicates stronger creativity, but may sacrifice precision to some extent.

  • Token Settings: Sets the allocation weights for the extended functionalities within the LLM's limited context. You can select only the required extended functionalities to save tokens.

Branch Predicate

  • Custom Branches: Define multiple custom branches, write a judgment condition for each, and let the LLM determine which branch the input should enter.

  • Other: If the LLM determines that the input does not belong to any custom branch, this branch is used as the output.
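The routing logic above can be sketched as: ask a classifier which branch the input matches, and fall back to "Other" when nothing matches. Here a keyword matcher stands in for the LLM call purely for illustration; the function names and branch data are hypothetical.

```python
# Hypothetical sketch of Branch Predicate routing.
def route_branch(user_input, branches, classify):
    """classify(input, branches) returns a branch name or None; unknown -> 'Other'."""
    choice = classify(user_input, branches)
    return choice if choice in branches else "Other"

# Stand-in for the LLM judgment: simple keyword matching, illustration only.
def fake_classify(text, branches):
    for name, keywords in branches.items():
        if any(k in text.lower() for k in keywords):
            return name
    return None

branches = {"refund": ["refund", "money back"], "shipping": ["ship", "deliver"]}
route_branch("When will my order ship?", branches, fake_classify)  # "shipping"
route_branch("Tell me a joke", branches, fake_classify)            # "Other"
```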

Prompt Configuration

You can adjust the order of the extended functionalities within the prompt and enter text prompts to supplement necessary information; together they form the final prompt.

Boolean Predicate

This component passes the input information to the LLM, which evaluates it to a boolean value (true/false) for output.

AI Model

  • Model Version: The version of the LLM model that performs the boolean evaluation.

  • Creativity: Determines whether the LLM output is more precise or more creative. A higher value indicates stronger creativity, but may sacrifice precision to some extent.

  • Token Settings: Sets the allocation weights for the extended functionalities within the LLM's limited context. You can select only the required extended functionalities to save tokens.

Boolean Predicate

  • True: If LLM determines the input to be true, this branch will be executed.

  • False: If LLM determines the input to be false, this branch will be executed.

  • Other: When the LLM determines the input to be neither true nor false, this branch serves as the output.
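The three-way routing above can be sketched as: normalize the LLM's verdict and map anything other than a clear true/false to the "Other" branch. The evaluator here is a hypothetical stand-in for the LLM call.

```python
# Hypothetical sketch of Boolean Predicate routing.
def route_boolean(user_input, evaluate):
    """evaluate(input) stands in for the LLM; anything but true/false -> 'Other'."""
    verdict = str(evaluate(user_input)).strip().lower()
    if verdict == "true":
        return "True"
    if verdict == "false":
        return "False"
    return "Other"  # ambiguous or unparseable verdicts fall through here

route_boolean("Is 2 + 2 = 4?", lambda s: "TRUE")    # "True"
route_boolean("Is the sky green?", lambda s: "false")  # "False"
route_boolean("Is it tasty?", lambda s: "unsure")   # "Other"
```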

Prompt Configuration

You can adjust the order of different extended functionalities within the prompt, enter text prompts to supplement necessary information, and form the final prompt.

Preset Response

Preset Response

Regardless of the input content, this component performs no execution or evaluation and directly outputs the preset content.

  • Data Type: Currently only supports "text" type.

  • Preset Content: Defines the content to be output.
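The behavior above amounts to ignoring the input and returning a fixed configured value. A minimal sketch, with hypothetical parameter names mirroring the two settings:

```python
# Hypothetical sketch of the Preset Response component.
def preset_response(_input, data_type="text", preset_content=""):
    """Ignore the input entirely and return the configured preset content."""
    if data_type != "text":
        raise ValueError("Only the 'text' data type is currently supported")
    return preset_content

preset_response("anything at all", preset_content="Thanks, we'll be in touch!")
```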
