Fix #4550: Update tokenizers dependency to >=0.21 to fix broken sdist#4551
Open
devin-ai-integration[bot] wants to merge 2 commits intomainfrom
Conversation
(#4550) tokenizers 0.20.x has a broken pyproject.toml (missing project.version), which causes installation failures when building from source (sdist), particularly on Windows with uv. Update the constraint from ~=0.20.3 to >=0.21,<1 to avoid the broken versions. Add regression tests to ensure the constraint stays correct. Co-Authored-By: João <joao@crewai.com>
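The practical effect of the specifier change can be checked with the third-party `packaging` library, which implements the same PEP 440 semantics that pip and uv use. The version numbers below come from the PR description; this is an illustrative check, not part of the PR's own test suite:

```python
# Show which tokenizers versions each specifier admits (PEP 440 semantics).
from packaging.specifiers import SpecifierSet

old = SpecifierSet("~=0.20.3")   # equivalent to >=0.20.3, <0.21
new = SpecifierSet(">=0.21,<1")  # the constraint introduced by this PR

for version in ["0.20.3", "0.21.0", "0.22.2"]:
    print(version, "old:", old.contains(version), "new:", new.contains(version))

# 0.20.3 (the release with the broken sdist) satisfies only the old range,
# while 0.22.2 (the version now in uv.lock) satisfies only the new one.
```

Note that `~=0.20.3` can never resolve past the 0.20.x series, which is why bumping the lower bound required changing the operator, not just the version.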
Fix #4550: Update tokenizers dependency to >=0.21 to fix broken sdist
Summary
`tokenizers==0.20.3` has a broken `pyproject.toml` (missing the `project.version` field), causing installation failures when `uv` needs to build from source (sdist). This changes the constraint from `~=0.20.3` (locked to 0.20.x) to `>=0.21,<1`, which resolves to `tokenizers==0.22.2` in the lock file. `tokenizers` is not directly imported anywhere in the crewai source; it is a transitive dependency (used by chromadb and others). Added 4 regression tests to guard against re-introducing the broken version range.
Review & Testing Checklist for Human
- Verify the `uv.lock` regeneration didn't introduce unintended dependency bumps. The lock file was fully regenerated (not surgically patched), so many packages beyond tokenizers changed versions (e.g., `a2a-sdk` 0.3.22 → 0.3.24, numpy resolution splits by Python version). Skim the lock diff for anything concerning; the diff is very large.
- Decide whether `>=0.21,<1` is too broad a range. The old constraint was very tight (`~=0.20.3`). A narrower range like `~=0.22.0` might be more appropriate for this project's pinning style. Consider whether chromadb or other transitive consumers of tokenizers have compatibility concerns with future tokenizers releases.

Notes