3 changes: 2 additions & 1 deletion providers/ai21/jamba-large-1.7.yaml
@@ -5,9 +5,10 @@ costs:
 features:
 - function_calling
 - json_output
 - system_messages
+- tool_choice
 limits:
+context_window: 256000
-max_input_tokens: 256000
Removed max_input_tokens instead of keeping it alongside context_window

Medium Severity

max_input_tokens was removed when context_window was added, but every other model in the repository that declares context_window (all Anthropic models) also retains max_input_tokens. This makes jamba-large-1.7 the only model with context_window but without max_input_tokens, which breaks consumers relying on max_input_tokens to determine the input token limit for this model.
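Following that reasoning, a minimal fix would restore `max_input_tokens` alongside `context_window` in the `limits` block, mirroring how the Anthropic model files declare both (a sketch of the intended shape, not the exact file contents):

```yaml
limits:
  context_window: 256000     # total tokens the model can see (input + output)
  max_input_tokens: 256000   # kept so consumers reading this key still work
  max_output_tokens: 4096
  max_tokens: 4096
```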


Reviewed by Cursor Bugbot for commit 79ed952.

max_output_tokens: 4096
max_tokens: 4096
modalities: