Description
🐛 Structured Output Fails on LLM Node in AgentFlow v2 (Flowise 3.0.11)
When using AgentFlow v2 with an LLM node (Flowise v3.0.11), structured output only works intermittently depending on the LLM configuration.
Issue Summary
When the LLM node uses the ChatOpenAI provider with default parameters (reasoning disabled, streaming disabled, image uploads disabled, and no changes to the response format), the node throws the following error:
Error in LLM node: this.client.chat.completions.parse is not a function
Notes
This is based on the OpenAI migration guide from Chat Completions → Responses API:
https://platform.openai.com/docs/guides/migrate-to-responses
Both APIs support structured output.
Only the Responses API provides reasoning summaries and image generation.
Importantly, chat.completions.parse does not exist in the classic Chat Completions API.
It appears that, when reasoning is disabled, Flowise still routes structured-output requests through a Chat Completions client, causing the .parse() call to fail.
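A defensive sketch of what a safer call site could look like. This is hypothetical code, not Flowise source: it assumes a client shaped like the openai-node SDK, where `chat.completions.parse` may be missing on older SDK versions or on the classic Chat Completions client, and falls back to `create` with a JSON `response_format` instead of throwing "is not a function":

```typescript
// Hypothetical guard, not Flowise's actual implementation.
// ChatClient models the subset of the openai-node SDK surface assumed here.
type ChatClient = {
  chat: {
    completions: {
      create: (req: object) => Promise<object>;
      // parse() is optional: it is the part that may not exist,
      // which is exactly the failure reported above.
      parse?: (req: object) => Promise<object>;
    };
  };
};

async function structuredCompletion(client: ChatClient, req: object): Promise<object> {
  const completions = client.chat.completions;
  if (typeof completions.parse === "function") {
    // Structured-output helper is available: use it directly.
    return completions.parse(req);
  }
  // Fallback: classic Chat Completions with a JSON response_format,
  // so a missing parse() degrades gracefully instead of crashing the node.
  return completions.create({ ...req, response_format: { type: "json_object" } });
}
```

With a guard like this, the node would still return JSON-shaped output on the classic client rather than failing outright.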
Workaround
If I enable reasoning, Flowise switches to the Responses API, and then:
- Structured outputs work correctly.
- No .parse() error occurs.
This suggests that ChatOpenAI + structured output triggers the wrong client pathway unless reasoning is explicitly turned on.
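To make the suspected routing concrete, here is a hypothetical sketch (not Flowise source; the function names and config shape are my own) contrasting the behavior described above with the behavior the issue asks for:

```typescript
// Hypothetical model of the client-selection logic described in this report.
interface LlmNodeConfig {
  structuredOutput: boolean;
  reasoning: boolean;
}

// Suspected current behavior: only the reasoning flag switches the client,
// so structured output alone stays on the Chat Completions pathway.
function pickPathway(cfg: LlmNodeConfig): "responses" | "chat_completions" {
  return cfg.reasoning ? "responses" : "chat_completions";
}

// Expected behavior: structured output should also select a client
// that actually implements the parse() helper.
function pickPathwayFixed(cfg: LlmNodeConfig): "responses" | "chat_completions" {
  return cfg.reasoning || cfg.structuredOutput ? "responses" : "chat_completions";
}
```

Under this model, the default configuration (structured output on, reasoning off) lands on the Chat Completions pathway and hits the missing .parse() call, while the workaround of enabling reasoning flips it to the Responses pathway.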
Reproduction Steps
- Create a new AgentFlow v2 flow.
- Add an LLM node.
- Select ChatOpenAI as the model provider.
- Leave all parameters untouched (reasoning disabled, image uploads disabled).
- Enable Structured Output.
- Run the flow.
Result: LLM node fails with the error above.
The problem also reproduces when following this tutorial: https://www.youtube.com/watch?v=TbZaj5SZcbM&list=PL4HikwTaYE0FFKifDJcgnP23LjGAZjzL8&index=8
Expected Behavior
Structured output should work reliably using the default ChatOpenAI configuration without requiring any special flags.
Use Method
Docker
Flowise Version
3.0.11
Operating System
Linux
Browser
Chrome