Chat Input
Choose the appropriate Large Language Model (LLM) for your current conversation. For detailed settings, please refer to the Model Providers section.
If the chosen model supports file or image input, you can upload relevant files or images during the conversation.
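Behind the scenes, uploaded images are typically passed to the provider as part of the message content. A minimal sketch, assuming the OpenAI Python SDK and a vision-capable model (the model name and image URL are placeholders):

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Send a text prompt together with an image reference in a single user message.
response = client.chat.completions.create(
    model="gpt-4o-mini",  # assumed vision-capable model
    messages=[
        {
            "role": "user",
            "content": [
                {"type": "text", "text": "What is shown in this image?"},
                {"type": "image_url", "image_url": {"url": "https://example.com/photo.png"}},
            ],
        }
    ],
)
print(response.choices[0].message.content)
```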
Adjust the model's output randomness using the temperature setting. A higher value produces more varied responses, while a lower value keeps them more focused and repeatable. For more information, see the Large Language Model Guide.
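Conceptually, the temperature slider maps directly to the provider's sampling parameter. A minimal sketch, assuming the OpenAI Python SDK (the model name and prompt are illustrative):

```python
from openai import OpenAI

client = OpenAI()

# The same prompt at two temperatures: 0 keeps the output focused and
# repeatable, while higher values increase variation between runs.
for temperature in (0.0, 1.2):
    response = client.chat.completions.create(
        model="gpt-4o-mini",      # assumed model name
        messages=[{"role": "user", "content": "Suggest a name for a coffee shop."}],
        temperature=temperature,  # 0.0–2.0 for OpenAI-compatible APIs
    )
    print(temperature, response.choices[0].message.content)
```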
Set how many previous messages are included with each request. Including more history gives the model more context to work with, but also consumes more context tokens.
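In practice, a history limit simply controls how many past messages are sent along with each new request. A rough sketch of that idea in plain Python (the helper and the sample conversation are illustrative, not part of the product):

```python
# Hypothetical helper: keep the system prompt plus only the last N messages.
def trim_history(messages: list[dict], history_count: int) -> list[dict]:
    system = [m for m in messages if m["role"] == "system"]
    rest = [m for m in messages if m["role"] != "system"]
    return system + rest[-history_count:]

conversation = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Hi!"},
    {"role": "assistant", "content": "Hello, how can I help?"},
    {"role": "user", "content": "Summarize our chat so far."},
]

# With a history count of 2, only the two most recent messages
# (plus the system prompt) are sent, reducing token usage.
print(trim_history(conversation, history_count=2))
```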
Select which plugins to activate for the conversation. For further guidance, check the Plugin Usage section.
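Chat plugins are commonly built on the model's tool-calling interface: each activated plugin is described to the model as a callable function. A minimal sketch, assuming an OpenAI-compatible tools API (the weather tool is a made-up example, not an actual plugin):

```python
from openai import OpenAI

client = OpenAI()

# A made-up "weather" plugin exposed to the model as a callable tool.
tools = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Get the current weather for a city.",
            "parameters": {
                "type": "object",
                "properties": {"city": {"type": "string"}},
                "required": ["city"],
            },
        },
    }
]

response = client.chat.completions.create(
    model="gpt-4o-mini",  # assumed model name
    messages=[{"role": "user", "content": "What's the weather in Paris?"}],
    tools=tools,
)
# If the model decides to use the plugin, it returns a tool call instead of text.
print(response.choices[0].message.tool_calls)
```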
Monitor the conversation's context length and token consumption.
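Token consumption is measured by the model's tokenizer rather than by character count. A rough sketch of counting tokens locally, assuming the tiktoken library and the cl100k_base encoding (the actual encoding depends on the model in use):

```python
import tiktoken

# cl100k_base is the encoding used by many recent OpenAI chat models;
# other providers and models may tokenize differently.
encoding = tiktoken.get_encoding("cl100k_base")

message = "Set how many previous messages are included with each request."
tokens = encoding.encode(message)
print(f"{len(tokens)} tokens for {len(message)} characters")
```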
End the current conversation and begin a new topic. For additional details, refer to the Topic Usage section.
Submit your current input to the model. Additional options for sending can be found in the dropdown menu next to the send button.