Open
Labels
bug (Something isn't working)
Description
Describe the bug
- With some ChatOpenAI models, such as GPT 5.1, setting Max Tokens always returns an error
- Older models, such as GPT 4o mini, work fine with Max Tokens set and return no error
To Reproduce
- Create a new Flow
- Create LLM/Agent node
- Connect to ChatOpenAI with proper credentials
- Important: Set the Max Tokens parameter in ChatOpenAI Parameters
- Test the Flow (Open Chat)
- The chat returns the following error:
Error in LLM node: 400 Unsupported parameter: 'max_tokens' is not supported with this model. Use 'max_completion_tokens' instead.
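A likely fix on the integration side is to map the configured Max Tokens value to the request field the target model accepts, since newer OpenAI chat models reject `max_tokens` and require `max_completion_tokens`. A minimal sketch of such a mapping (the model-prefix list and function name are assumptions for illustration, not Flowise's actual code):

```typescript
// Map the UI's "Max Tokens" setting to the request parameter the model accepts.
// Newer OpenAI chat models (e.g. the gpt-5 and o1 families) reject `max_tokens`
// and require `max_completion_tokens`; older models still use `max_tokens`.
// The prefix list below is an assumption for illustration.
function maxTokensParam(
    model: string,
    maxTokens: number
): Record<string, number> {
    const newerPrefixes = ['gpt-5', 'o1', 'o3']
    if (newerPrefixes.some((p) => model.startsWith(p))) {
        return { max_completion_tokens: maxTokens }
    }
    return { max_tokens: maxTokens }
}

console.log(JSON.stringify(maxTokensParam('gpt-5.1', 256)))
console.log(JSON.stringify(maxTokensParam('gpt-4o-mini', 256)))
```

The returned object can then be spread into the chat-completions request body, so the same Max Tokens input works across both model generations.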
Expected behavior
GPT 5.1 should work fine with Max Tokens configured.
Screenshots
Flow
No response
Use Method
Docker
Flowise Version
3.0.12
Operating System
Linux
Browser
Chrome
Additional context
No response