ci: only require packages from internal pypi if there are no 3.13 wheels upstream#115455
joshuarli wants to merge 2 commits into
Conversation
Cursor Bugbot has reviewed your changes and found 2 potential issues.
Reviewed by Cursor Bugbot for commit 64fa016.
Package {package["name"]}=={version} in (unknown) is sourced from internal
PyPI but upstream PyPI already has cp313 wheels for macOS arm64 and/or Linux
Bug: The error message uses "and/or" when describing platform requirements, but the code logic requires "and", potentially misleading developers.
Severity: LOW
Suggested Fix
Update the error message string at lines 82-84 to use "and" instead of "and/or". Similarly, correct the docstring for the _has_upstream_cp313_wheels function to accurately reflect that wheels for both macOS and Linux are required.
Prompt for AI Agent
Review the code at the location below. A potential bug has been identified by an AI
agent. Verify if this is a real issue. If it is, propose a fix; if not, explain why it's
not valid.
Location: tools/lint_requirements.py#L82-L83
Potential issue: The error message displayed by the linter at lines 82-84 is
inconsistent with the underlying code logic. The message states that a package can be
moved if it has wheels for "macOS arm64 and/or Linux x86_64", implying either platform
is sufficient. However, the function `_has_upstream_cp313_wheels` at line 39 requires
wheels for both platforms (`has_mac and has_linux`). This discrepancy can mislead
developers into believing a package is ready to be moved from the internal PyPI when it
does not yet meet the linter's actual requirements. The function's docstring also
contains this same inaccuracy.
Also affects:
tools/lint_requirements.py#L39
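For illustration, the both-platforms logic the linter actually enforces can be sketched like this (a hypothetical reimplementation; the helper name and wheel-filename matching are assumptions, not the actual linter code):

```python
# Hypothetical sketch of the "and" logic: a package may leave internal
# PyPI only if upstream has cp313 wheels for BOTH macOS arm64 AND
# Linux x86_64 -- not either/or, as the error message currently implies.
def has_cp313_wheels_for_both(filenames: list[str]) -> bool:
    has_mac = any(
        "cp313" in f and "macosx" in f and "arm64" in f for f in filenames
    )
    has_linux = any(
        "cp313" in f and "manylinux" in f and "x86_64" in f for f in filenames
    )
    return has_mac and has_linux
```

A package with only a macOS wheel would still (correctly) be kept on internal PyPI under this logic, which is why the "and/or" wording is misleading.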
try:
    with urllib.request.urlopen(url, timeout=5) as resp:
        data = json.load(resp)
except Exception:
Bug: The PyPI API call in _has_upstream_cp313_wheels does not canonicalize package names, causing it to fail silently for packages with underscores.
Severity: MEDIUM
Suggested Fix
Before constructing the URL, normalize the package name according to PEP 503 rules: replace each run of hyphens, underscores, and periods with a single hyphen and lowercase the result. For example: canonical_name = re.sub(r"[-_.]+", "-", name).lower().
Prompt for AI Agent
Review the code at the location below. A potential bug has been identified by an AI
agent. Verify if this is a real issue. If it is, propose a fix; if not, explain why it's
not valid.
Location: tools/lint_requirements.py#L22
Potential issue: The function `_has_upstream_cp313_wheels` constructs a PyPI API URL
using the package `name` directly without normalization. According to PEP 503, PyPI
package names in URLs must be canonicalized, with underscores converted to hyphens. If a
package with an underscore in its name is processed, the constructed URL will be
incorrect, resulting in a 404 error. The code catches the resulting exception and
silently treats it as if the check passed, which means the linter will fail to flag
packages that should be flagged, undermining the purpose of the check for any such
packages added in the future.
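The suggested normalization follows PEP 503's name-canonicalization rule. A minimal sketch (the helper name is mine, not from the linter):

```python
import re


def canonicalize_name(name: str) -> str:
    # PEP 503: collapse any run of "-", "_", "." into a single "-",
    # then lowercase. This is the form PyPI uses in its URLs.
    return re.sub(r"[-_.]+", "-", name).lower()
```

Applying this before building the JSON API URL means a name like `typing_extensions` is looked up as `typing-extensions`, matching PyPI's canonical form.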

internal PyPI was originally created to provide prebuilt wheels for tricky deps that build poorly on macOS and were causing a lot of local dev issues
these issues are still relevant, as a handful of deps don't have upstream wheels for the platforms we care about (Linux amd64, macOS arm64) - but far fewer than before
so we can relax the rules and only enforce internal PyPI for deps that actually need it
this will make upgrading to Python 3.14 easier/faster (we won't have to rebuild much of the world) and make Python dep changes much more agile, since we won't have to wait for publish -> pypi branch -> pypi main -> upload -> resync here
i've confirmed the new wheels are hash-equivalent (except cronsim for some reason - probably a timestamp issue during the internal PyPI rebuild - but the version appears to be safe)
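A hash-equivalence check like the one described can be sketched as follows (file paths are placeholders; this is not the actual verification process used for this PR):

```python
import hashlib


def sha256_of(path: str) -> str:
    # Stream the wheel in chunks so large files aren't loaded fully
    # into memory; return the hex digest for comparison.
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 16), b""):
            h.update(chunk)
    return h.hexdigest()


# Two wheels are hash-equivalent when their digests match, e.g.:
# sha256_of("internal/pkg-1.0-cp313-...whl") == sha256_of("upstream/pkg-1.0-cp313-...whl")
```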