
Cannot use Max Tokens for GPT 5.1 #5804

@daffaalex22

Description


Describe the bug

  • With some newer ChatOpenAI models, such as GPT 5.1, setting Max Tokens always returns an error
  • With older models, such as GPT 4o mini, setting Max Tokens does not return an error

To Reproduce

  1. Create a new Flow
  2. Create LLM/Agent node
  3. Connect to ChatOpenAI with proper credentials
  4. Important: Set the Max Tokens parameter in ChatOpenAI Parameters
  5. Test the Flow (Open Chat)
  6. It will return an error:
Error in LLM node: 400 Unsupported parameter: 'max_tokens' is not supported with this model. Use 'max_completion_tokens' instead.
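The error suggests a possible workaround at the integration layer: translate the legacy `max_tokens` parameter into `max_completion_tokens` before the request reaches the API. A minimal sketch of that mapping is below; the exact set of models that reject `max_tokens` is an assumption here (newer GPT 5.x and o-series reasoning models), not something confirmed in this report.

```python
def adapt_token_param(model: str, params: dict) -> dict:
    """Hypothetical helper: rename 'max_tokens' to 'max_completion_tokens'
    for models assumed to reject the legacy parameter (gpt-5*, o-series).
    Older models (e.g. gpt-4o-mini) keep 'max_tokens' unchanged."""
    needs_new_param = model.startswith(("gpt-5", "o1", "o3", "o4"))
    out = dict(params)
    if needs_new_param and "max_tokens" in out:
        out["max_completion_tokens"] = out.pop("max_tokens")
    return out

# Example: GPT 5.1 gets the new parameter name, GPT 4o mini is untouched.
print(adapt_token_param("gpt-5.1", {"max_tokens": 256}))
print(adapt_token_param("gpt-4o-mini", {"max_tokens": 256}))
```

A fix along these lines inside Flowise's ChatOpenAI node would let the existing Max Tokens field keep working for both model generations.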

Expected behavior

GPT 5.1 should work with Max Tokens configured.

Screenshots

(Two screenshot images attached.)

Flow

No response

Use Method

Docker

Flowise Version

3.0.12

Operating System

Linux

Browser

Chrome

Additional context

No response

Labels

bug: Something isn't working