fix: _handle_async_stream_response() in OpenAIChatGenerator handles asyncio.CancelledError closing the response stream
#10175
Related Issues
Proposed Changes:
The `_handle_async_stream_response()` method in `OpenAIChatGenerator` did not handle `asyncio.CancelledError`. When a streaming task was cancelled mid-stream, the `async for` loop exited without properly closing the OpenAI stream object, so tokens kept being generated and billed, and connections and other resources could leak.

Added exception handling in `_handle_async_stream_response()` that catches `asyncio.CancelledError` and gracefully closes the stream, wrapping the cleanup in `asyncio.shield()` so it completes even while the task is being cancelled.

How did you test it?
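To illustrate the shape of both the fix and the unit test, here is a self-contained sketch; `FakeStream` and `handle_stream` are hypothetical stand-ins, not the actual Haystack code:

```python
import asyncio
from unittest.mock import AsyncMock

class FakeStream:
    """Hypothetical stand-in for OpenAI's async stream; close() is an
    AsyncMock so the test can count how often it was awaited."""

    def __init__(self):
        self.close = AsyncMock()

    async def __aiter__(self):
        while True:
            await asyncio.sleep(0.01)  # simulate waiting for the next token
            yield "chunk"

async def handle_stream(stream):
    """Mirrors the fix: on cancellation, close the stream inside
    asyncio.shield() so the cleanup completes, then re-raise."""
    chunks = []
    try:
        async for chunk in stream:
            chunks.append(chunk)
    except asyncio.CancelledError:
        await asyncio.shield(stream.close())
        raise  # propagate the cancellation to the caller

async def main():
    stream = FakeStream()
    task = asyncio.create_task(handle_stream(stream))
    await asyncio.sleep(0.05)  # let a few chunks arrive
    task.cancel()              # cancel mid-stream
    try:
        await task
    except asyncio.CancelledError:
        pass
    stream.close.assert_awaited_once()  # cleanup ran exactly once
    return stream.close.await_count

print(asyncio.run(main()))  # → 1
```

`asyncio.shield()` matters here because the surrounding task has already received a cancellation request; without it, an `await` inside the handler could be interrupted before the stream is closed.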
- `test_async_stream_closes_on_cancellation()`: mocks an async stream, starts a streaming task, cancels it mid-stream, and verifies that `stream.close()` is called exactly once.
- `test_run_async_cancellation_integration()`: runs against the real OpenAI API by starting a long-running streaming task (essay generation), cancelling it after 2 seconds, and verifying graceful cancellation with partial chunks received.

Minor fix in `test_openai_responses.py`: added the missing `OPENAI_API_KEY` environment variable setup via `monkeypatch`.

Notes for the reviewer
This PR also adds the same functionality to the following ChatGenerators, since they all inherit from `OpenAIChatGenerator` (all but the first are part of `haystack-core-integrations`):

- `AzureOpenAIChatGenerator`
- `NvidiaChatGenerator`
- `AIMLAPIChatGenerator`
- `MistralChatGenerator`
- `STACKITChatGenerator`
- `OpenRouterChatGenerator`
- `TogetherAIChatGenerator`

Checklist
The PR title uses one of the conventional commit prefixes (`fix:`, `feat:`, `build:`, `chore:`, `ci:`, `docs:`, `style:`, `refactor:`, `perf:`, `test:`), with `!` added in case the PR includes breaking changes.