- Upgrade @ai-sdk/* packages from v1 to v3
- Remove ollama-ai-provider dependency, use OpenAI-compatible API instead
- Add getOllamaBaseUrl and getOllamaOpenAIBaseUrl for URL normalization
- Update streaming preview to handle text field and reasoning-delta chunks
- Change maxTokens to maxOutputTokens for AI SDK v3 compatibility
- Add thinking mode persistence (save/getThinkingModeEnabled)
- Add tests for Ollama URL normalization
- Upgrade TypeScript to v5.9.3

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
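The commit message mentions `getOllamaBaseUrl` and `getOllamaOpenAIBaseUrl` for URL normalization. A minimal sketch of what such helpers might look like is below; the function bodies, the default host, and the suffix-stripping rules are assumptions, not the actual implementation from this PR.

```typescript
// Hypothetical sketch of the URL normalization helpers named in the commit
// message; the real implementations in the PR may differ.
const DEFAULT_OLLAMA_URL = "http://localhost:11434";

function getOllamaBaseUrl(raw?: string): string {
  // Trim whitespace and trailing slashes; fall back to the default host.
  const url = (raw ?? "").trim().replace(/\/+$/, "");
  if (url === "") return DEFAULT_OLLAMA_URL;
  // Strip a trailing /api or /v1 suffix so callers always get a bare origin.
  return url.replace(/\/(api|v1)$/, "");
}

function getOllamaOpenAIBaseUrl(raw?: string): string {
  // Ollama exposes its OpenAI-compatible API under the /v1 path.
  return `${getOllamaBaseUrl(raw)}/v1`;
}
```

Normalizing to a bare origin first means both the native and OpenAI-compatible endpoints can be derived from one user-supplied setting, regardless of trailing slashes or pasted suffixes.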
Augment PR Summary: Upgrades the browser extension's AI integration to the newer AI SDK provider packages and improves Ollama support via an OpenAI-compatible endpoint.
```diff
 ): StreamingPreviewState {
   const includeReasoning = options.includeReasoning ?? true;
-  const textDelta = chunk.textDelta || "";
+  const textDelta = chunk.text || chunk.textDelta || "";
```
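The one-line change above reads whichever field is present. A fuller sketch of how a chunk handler could support both the AI SDK v3 `text` field and the older `textDelta` field, plus the `reasoning-delta` chunks the commit message mentions, might look like this. The `StreamingPreviewState` shape, the `StreamChunk` interface, and the switch structure are assumptions for illustration; only the names come from the diff.

```typescript
// Assumed state and chunk shapes; the real types in the PR may differ.
interface StreamingPreviewState {
  text: string;
  reasoning: string;
}

interface StreamChunk {
  type: string;
  text?: string;      // AI SDK v3 field name
  textDelta?: string; // older field name
}

function applyStreamingPreviewChunk(
  state: StreamingPreviewState,
  chunk: StreamChunk,
  options: { includeReasoning?: boolean } = {}
): StreamingPreviewState {
  const includeReasoning = options.includeReasoning ?? true;
  // Accept either field name so both SDK versions stream correctly.
  const delta = chunk.text || chunk.textDelta || "";
  switch (chunk.type) {
    case "text":
    case "text-delta":
      return { ...state, text: state.text + delta };
    case "reasoning-delta":
      // Reasoning is optional in the preview; drop it when disabled.
      return includeReasoning
        ? { ...state, reasoning: state.reasoning + delta }
        : state;
    default:
      return state; // ignore tool calls, finish events, etc.
  }
}
```

Returning a new state object on every chunk keeps the handler pure, which makes it straightforward to unit-test against fabricated chunk sequences.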
streamText().fullStream chunks are documented to use type: "text" (with a text field) for text deltas; applyStreamingPreviewChunk currently only treats "text-delta" chunks as response text, so "text" chunks would be ignored and the preview might never show the actual answer.

Severity: high
🤖 Generated with Claude Code