
fix: make model_info optional in ShowResponse for cloud model compatibility#648

Open
CyberRaccoonTeam wants to merge 1 commit into ollama:main from CyberRaccoonTeam:fix/show-response-model-info-validation

Conversation

@CyberRaccoonTeam

Summary

ShowResponse raises a ValidationError when model_info is omitted from the response, which happens with cloud models (e.g., when using OpenAI-compatible endpoints through Ollama).

Problem

```python
# This fails with ValidationError
response = client.show("gemma3:4b")
# ValidationError: Key "model_info" not found
```

Cloud models and some local models do not include model_info in their response; instead of handling the missing field gracefully, the client raises a validation error.

Solution

Make model_info optional with a default of None in the ShowResponse model.
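A minimal sketch of the change (the real ShowResponse in ollama-python defines many more fields, and the exact type annotations may differ):

```python
from typing import Any, Mapping, Optional

from pydantic import BaseModel, Field


class ShowResponse(BaseModel):
    # Before: Field(alias='model_info') with no default -- pydantic v2
    # treats the field as required even though it is annotated Optional.
    # After: an explicit default of None makes it truly optional.
    modelinfo: Optional[Mapping[str, Any]] = Field(None, alias='model_info')


# A cloud-model response that omits model_info now validates cleanly:
resp = ShowResponse.model_validate({})
print(resp.modelinfo)  # None
```

With the default in place, responses that do include model_info still populate the field as before.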

Testing

  • ✅ Tested with local models that return model_info
  • ✅ Tested with cloud models that omit model_info
  • ✅ Existing tests pass

Fixes the issue where client.show() crashes on cloud model responses.

fix: make model_info optional in ShowResponse for cloud model compatibility

Pydantic requires explicit default=None for Optional fields with aliases.
Without it, ShowResponse raises ValidationError when /api/show omits
model_info, which happens with some cloud models.

Fixes ollama#607
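The pydantic v2 behavior the commit message describes can be shown with a minimal model (an illustrative stand-in, not the actual ollama-python definition):

```python
from typing import Optional

from pydantic import BaseModel, Field, ValidationError


class WithoutDefault(BaseModel):
    # An Optional annotation alone does NOT make the field optional in
    # pydantic v2; with no explicit default it is still required.
    modelinfo: Optional[dict] = Field(alias='model_info')


raised = False
try:
    WithoutDefault.model_validate({})
except ValidationError:
    raised = True
print('missing model_info raises:', raised)
```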