This repository was archived by the owner on Aug 22, 2025. It is now read-only.
Default print handler only prints tool calls when LLM uses streaming #407

@tarmst

Description

If streaming is disabled on the LLM (e.g., by constructing the model with `disable_streaming=True`), the default print handler prints the output text but not the tool calls. I have confirmed that the model is emitting the tool calls.

Example Code

from langchain_openai import ChatOpenAI
import controlflow as cf

cf.settings.tools_verbose = True
cf.settings.default_print_handler_show_completion_tool_results = True
cf.settings.enable_default_print_handler = True

model = ChatOpenAI(
    model_name="model-name",
    base_url="base_url",
    api_key="none",
    disable_streaming=True,
    temperature=0.01,
)

cf.defaults.model = model
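
The snippet above only configures the model; to actually exercise the bug, a task that forces a tool call is needed. A minimal sketch of such a continuation is below — the tool name (`roll_die`) and the task prompt are illustrative, not from the original report, and the endpoint placeholders are kept as-is:

```python
import random


def roll_die() -> int:
    """A trivial stand-in tool, used only to make the model emit a tool call."""
    return random.randint(1, 6)


if __name__ == "__main__":
    from langchain_openai import ChatOpenAI
    import controlflow as cf

    cf.settings.tools_verbose = True
    cf.settings.default_print_handler_show_completion_tool_results = True
    cf.settings.enable_default_print_handler = True

    cf.defaults.model = ChatOpenAI(
        model_name="model-name",   # placeholder, as in the report
        base_url="base_url",       # placeholder, as in the report
        api_key="none",
        disable_streaming=True,    # with streaming enabled, tool calls do print
        temperature=0.01,
    )

    # Per the report: with disable_streaming=True, the print handler shows the
    # output text but the tool call to roll_die is never displayed.
    result = cf.run("Roll the die and report the number.", tools=[roll_die])
    print(result)
```

Toggling `disable_streaming` between `True` and `False` on the same task should show the tool call appearing only in the streaming case.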

Version Information

   ControlFlow version: 0.12.1
       Prefect version: 3.2.2
LangChain Core version: 0.3.34
        Python version: 3.10.11
              Platform: Linux-6.2.0-39-generic-x86_64-with-glibc2.31

LLM provider: tried both vLLM and Ollama.

Additional Context

No response

Metadata

Assignees: no one assigned
Labels: bug (Something isn't working)
Milestone: no milestone
Development: no branches or pull requests