From 21f3d55312fda0787e39f463fa29325937b1205f Mon Sep 17 00:00:00 2001
From: Steve Loeppky
Date: Tue, 31 Mar 2026 20:35:11 -0700
Subject: [PATCH 1/3] feat(foc-pr-report): add Project 14 PR workload report

Add foc-pr-report uv package: REST board q filter, Markdown table,
reviewer enrichment via PR reviews API. Add foc_project14_client for
shared Project 14 GraphQL and REST helpers. Refactor foc_wg_pr_notifier
to use shared fetch and field_values_by_name. Document token scope,
fetch behavior, and reviewer vs requested-review semantics.

Made-with: Cursor
---
 README.md                               |  13 +
 foc-pr-report/.gitignore                |   2 +
 foc-pr-report/README.md                 |  71 ++++
 foc-pr-report/foc_pr_report/__init__.py |   1 +
 foc-pr-report/foc_pr_report/cli.py      |  81 ++++
 foc-pr-report/foc_pr_report/report.py   | 116 ++++++
 foc-pr-report/pyproject.toml            |  14 +
 foc-pr-report/uv.lock                   | 161 ++++++++
 foc_project14_client.py                 | 522 ++++++++++++++++++++++++
 foc_wg_pr_notifier.py                   | 158 +------
 10 files changed, 984 insertions(+), 155 deletions(-)
 create mode 100644 foc-pr-report/.gitignore
 create mode 100644 foc-pr-report/README.md
 create mode 100644 foc-pr-report/foc_pr_report/__init__.py
 create mode 100644 foc-pr-report/foc_pr_report/cli.py
 create mode 100644 foc-pr-report/foc_pr_report/report.py
 create mode 100644 foc-pr-report/pyproject.toml
 create mode 100644 foc-pr-report/uv.lock
 create mode 100644 foc_project14_client.py

diff --git a/README.md b/README.md
index 525f035..a258159 100644
--- a/README.md
+++ b/README.md
@@ -65,6 +65,19 @@ Automated daily notification system that fetches open PRs from [FilOzone GitHub
 
 ***Periodic Runs:*** This notifier is scheduled to run periodically per [./github/workflows/fog-wg-pr-notifier.yml](fog-wg-pr-notifier.yml).
 
+GraphQL fetch logic for Project 14 lives in [`foc_project14_client.py`](foc_project14_client.py) (shared with the FOC PR report).
+
+### 📋 FOC PR report (Markdown)
+**Directory:** [foc-pr-report/](foc-pr-report/)
+
+Generates a Markdown table of PR workload on [Project 14 / View 2](https://github.com/orgs/FilOzone/projects/14/views/2): counts per GitHub user and board status (excludes Done and Todo), with deep links for each cell. Run with **`uv`**:
+
+```bash
+cd foc-pr-report && uv sync && GITHUB_TOKEN=your_token uv run foc-pr-report -o report.md
+```
+
+See [foc-pr-report/README.md](foc-pr-report/README.md).
+
 ### 🎯 GitHub Milestone Manager
 **Directory:** [github-milestone-creator/](github-milestone-creator/)
 
diff --git a/foc-pr-report/.gitignore b/foc-pr-report/.gitignore
new file mode 100644
index 0000000..a230a78
--- /dev/null
+++ b/foc-pr-report/.gitignore
@@ -0,0 +1,2 @@
+.venv/
+__pycache__/
diff --git a/foc-pr-report/README.md b/foc-pr-report/README.md
new file mode 100644
index 0000000..2e12624
--- /dev/null
+++ b/foc-pr-report/README.md
@@ -0,0 +1,71 @@
+# FOC PR report (Project 14)
+
+Generates a Markdown table of pull requests on [FilOzone Project 14](https://github.com/orgs/FilOzone/projects/14), grouped by GitHub user and board **Status**. PRs in **Done** or **Todo** are excluded. All PR states (open/closed/draft) are included if they remain on the board with a qualifying status.
+
+Requested **team** reviewers are not attributed to user rows (same as other tpm-utils tooling).
+
+## Token
+
+You need credentials that can read FilOzone org data and **GitHub Projects (classic + v2)**. The GraphQL API requires the **`read:project`** scope. (`repo` and `read:org` alone are not enough.)
+
+### GitHub CLI (recommended)
+
+1. Ensure the `read:project` scope is on your `gh` login (the default-host `github.com` account must be able to see FilOzone Project 14):
+
+```bash
+gh auth refresh -s read:project
+```
+
+2. Pass the token to this tool:
+
+```bash
+GITHUB_TOKEN=$(gh auth token) uv run foc-pr-report -o foc-report.md
+```
+
+Or export it for the shell session:
+
+```bash
+export GITHUB_TOKEN=$(gh auth token)
+uv run foc-pr-report
+```
+
+If you see an **INSUFFICIENT_SCOPES** / **read:project** error, run `gh auth refresh -s read:project` (or `gh auth login` and include project read when asked). For org-owned projects, accept **organization** access for FilOzone if prompted.
+
+### Personal access token
+
+Alternatively, set `GITHUB_TOKEN` to a classic PAT from [GitHub → Settings → Developer settings → Tokens](https://github.com/settings/tokens) with **`read:project`** enabled (plus **`read:org`** and repo read as needed).
+
+## Run with uv
+
+```bash
+cd foc-pr-report
+uv sync
+GITHUB_TOKEN=$(gh auth token) uv run foc-pr-report -o foc-report.md
+```
+
+Or print to stdout:
+
+```bash
+GITHUB_TOKEN=$(gh auth token) uv run foc-pr-report
+```
+
+## How fetching works
+
+The tool uses the **REST** [List project items](https://docs.github.com/en/rest/projects/items#list-items-for-an-organization-owned-project) endpoint with the same **`q`** filter string as the board (for example `is:pr` plus `-status:"…"`). GitHub applies that filter **server-side**, so only matching cards are returned and paginated, rather than paging through the entire project as a plain GraphQL listing would require on a very large board.
+
+### Reviewer column vs GitHub APIs
+
+The **project table’s “Reviewers” column** in the GitHub UI is a single place to show review-related people, but the data comes from more than one backend concept:
+
+| Concept | What it is | Typical API |
+|--------|------------|-------------|
+| **Requested reviewers** | Users (or teams) currently asked to review **this PR**; this is what powers “awaiting review” for those names. | On each card, the REST payload exposes this as `requested_reviewers` (often the same as [`requested_reviewers` on the pull request](https://docs.github.com/en/rest/pulls/pulls#get-a-pull-request)). |
+| **Submitted pull request reviews** | A formal review action: **Comment**, **Approve**, or **Request changes** (and related states), as listed under [List reviews for a pull request](https://docs.github.com/en/rest/pulls/reviews#list-reviews-for-a-pull-request). A **COMMENTED** submission is still a review, not the same as a plain **issue comment** on the conversation tab. | `GET /repos/{owner}/{repo}/pulls/{pull_number}/reviews` |
+
+Those two can diverge. For example, you might **submit a review** (so the UI can list you under **Reviewers**) while newer requests go to other people, so you are **no longer** in `requested_reviewers`, but you still appear on the board and still match a bare `username` filter.
+
+**What this tool does:** For each PR in the report, the **reviewer** counts use the **union** of (1) `requested_reviewers` on the card and (2) human users who have submitted at least one non-`PENDING` pull request review (same review list as above), **excluding the PR author**. That is meant to stay close to how filtering by person on the board behaves and what you see in the **Reviewers** column, not “everyone who left an issue comment.”
+
+## Links
+
+Each cell links to [View 2](https://github.com/orgs/FilOzone/projects/14/views/2) with a `filterQuery` matching that cell (base filter plus user, status, assignee, or reviewers as appropriate).
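[Editor's note: the reviewer-union rule documented in the README above can be sketched in a few lines. This is a hypothetical illustration, not code from this patch: the `reviewer_logins` helper and the dict shapes are made up to mirror the REST payloads the README cites (`requested_reviewers` on the PR, `user`/`state` on each submitted review). Requested team reviewers fall out naturally because REST represents them with a `slug` rather than a `login`.]

```python
def reviewer_logins(pr: dict, submitted_reviews: list[dict]) -> set[str]:
    """Union of requested reviewers and non-PENDING review submitters, minus the PR author.

    ``pr`` mimics a REST pull-request payload; ``submitted_reviews`` mimics the
    payload of GET /repos/{owner}/{repo}/pulls/{pull_number}/reviews.
    """
    requested = {
        u["login"]
        for u in pr.get("requested_reviewers", [])
        if u.get("login")  # team reviewers carry "slug", not "login", so they drop out
    }
    submitted = {
        r["user"]["login"]
        for r in submitted_reviews
        if r.get("user") and r.get("state") != "PENDING"  # drafts don't count
    }
    # A user who approved their own PR (impossible) or self-commented is the author;
    # the author is excluded from reviewer counts either way.
    return (requested | submitted) - {pr.get("user", {}).get("login")}
```

[A user who submitted a COMMENTED review and was later dropped from `requested_reviewers` still lands in the set, matching the board behavior the README describes.]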
diff --git a/foc-pr-report/foc_pr_report/__init__.py b/foc-pr-report/foc_pr_report/__init__.py
new file mode 100644
index 0000000..7dd07f9
--- /dev/null
+++ b/foc-pr-report/foc_pr_report/__init__.py
@@ -0,0 +1 @@
+"""FOC project 14 PR markdown report."""
diff --git a/foc-pr-report/foc_pr_report/cli.py b/foc-pr-report/foc_pr_report/cli.py
new file mode 100644
index 0000000..529b5ed
--- /dev/null
+++ b/foc-pr-report/foc_pr_report/cli.py
@@ -0,0 +1,81 @@
+#!/usr/bin/env python3
+"""CLI: fetch FilOzone Project 14 and print Markdown workload table."""
+
+from __future__ import annotations
+
+import argparse
+import os
+import sys
+from pathlib import Path
+
+import requests
+
+# Repo layout: tpm-utils/foc-pr-report/foc_pr_report/cli.py → parents[2] == tpm-utils
+_TPM_UTILS_ROOT = Path(__file__).resolve().parents[2]
+if str(_TPM_UTILS_ROOT) not in sys.path:
+    sys.path.insert(0, str(_TPM_UTILS_ROOT))
+
+from foc_project14_client import (  # noqa: E402
+    enrich_pull_items_with_submitted_reviewers,
+    fetch_project_board_items_rest_filtered,
+)
+
+from foc_pr_report.report import BASE_FILTER, aggregate_rows, render_markdown  # noqa: E402
+
+
+def main() -> None:
+    parser = argparse.ArgumentParser(
+        description="Generate Markdown table of FOC (Project 14) PR counts by person and status",
+    )
+    parser.add_argument(
+        "--token",
+        help="GitHub token (or set GITHUB_TOKEN)",
+    )
+    parser.add_argument(
+        "-o",
+        "--output",
+        help="Write Markdown to this file (default: stdout)",
+    )
+    parser.add_argument(
+        "-q",
+        "--quiet",
+        action="store_true",
+        help="Less progress output from the GitHub client",
+    )
+    args = parser.parse_args()
+
+    token = args.token or os.environ.get("GITHUB_TOKEN")
+    if not token:
+        print("Error: set GITHUB_TOKEN or pass --token", file=sys.stderr)
+        sys.exit(1)
+
+    session = requests.Session()
+    session.headers.update(
+        {
+            "Authorization": f"Bearer {token}",
+            "Content-Type": "application/json",
+        }
+    )
+
+    items = fetch_project_board_items_rest_filtered(
+        session,
+        filter_query=BASE_FILTER,
+        verbose=not args.quiet,
+    )
+    enrich_pull_items_with_submitted_reviewers(
+        session,
+        items,
+        verbose=not args.quiet,
+    )
+    md = render_markdown(aggregate_rows(items))
+
+    if args.output:
+        Path(args.output).write_text(md, encoding="utf-8")
+        if not args.quiet:
+            print(f"Wrote {args.output}")
+    else:
+        sys.stdout.write(md)
+
+
+if __name__ == "__main__":
+    main()
diff --git a/foc-pr-report/foc_pr_report/report.py b/foc-pr-report/foc_pr_report/report.py
new file mode 100644
index 0000000..e494729
--- /dev/null
+++ b/foc-pr-report/foc_pr_report/report.py
@@ -0,0 +1,116 @@
+"""Aggregate Project 14 PRs and render Markdown."""
+
+from __future__ import annotations
+
+from collections import defaultdict
+from typing import Any, Dict, List, Tuple
+from urllib.parse import quote
+
+from foc_project14_client import field_values_by_name
+
+# Base board filter (View 2) — keep in sync with manually applied FOC table links
+BASE_FILTER = 'is:pr -status:"🎉 Done" -status:"🐱 Todo"'
+
+STATUS_DONE = "🎉 Done"
+STATUS_TODO = "🐱 Todo"
+
+PROJECT_VIEW_2 = "https://github.com/orgs/FilOzone/projects/14/views/2"
+
+# Primary lanes first; other statuses sort after these, alphabetically
+STATUS_ORDER = [
+    "⌨️ In Progress",
+    "🔎 Awaiting review",
+    "✔️ Approved by reviewer",
+]
+
+Row = Tuple[str, str, int, int]
+
+
+def _filter_body(*parts: str) -> str:
+    return " ".join(parts)
+
+
+def view2_url(filter_body: str) -> str:
+    return f"{PROJECT_VIEW_2}?filterQuery={quote(filter_body)}"
+
+
+def _status_sort_key(status: str) -> Tuple[int, str]:
+    try:
+        return (0, f"{STATUS_ORDER.index(status):04d}")
+    except ValueError:
+        return (1, status.lower())
+
+
+def aggregate_rows(items: List[Dict[str, Any]]) -> List[Row]:
+    """Return (login, status, assignee_count, reviewer_count) for each user/status lane."""
+    assignee_counts: Dict[Tuple[str, str], int] = defaultdict(int)
+    reviewer_counts: Dict[Tuple[str, str], int] = defaultdict(int)
+
+    for item in items:
+        content = item.get("content")
+        if not content or content.get("__typename") != "PullRequest":
+            continue
+
+        field_values = field_values_by_name(item)
+        status = field_values.get("Status")
+        if not status or status in (STATUS_DONE, STATUS_TODO):
+            continue
+
+        assignee_nodes = content.get("assignees", {}).get("nodes", [])
+        assignee_logins = [
+            n.get("login")
+            for n in assignee_nodes
+            if n and n.get("login")
+        ]
+
+        # Reviewers: requested on the PR plus users who submitted a PR review (COMMENTED/APPROVED/…).
+        # See README — board UI can show the latter even when they're no longer in requested_reviewers.
+        reviewer_logins: set[str] = set()
+        for rr in content.get("reviewRequests", {}).get("nodes", []):
+            rev = rr.get("requestedReviewer") or {}
+            login = rev.get("login")
+            if login:
+                reviewer_logins.add(login)
+        for login in content.get("_submitted_reviewer_logins") or []:
+            reviewer_logins.add(login)
+
+        for login in assignee_logins:
+            assignee_counts[(login, status)] += 1
+        for login in reviewer_logins:
+            reviewer_counts[(login, status)] += 1
+
+    keys = set(assignee_counts) | set(reviewer_counts)
+    rows: List[Row] = []
+    for login, status in keys:
+        a = assignee_counts.get((login, status), 0)
+        r = reviewer_counts.get((login, status), 0)
+        if a == 0 and r == 0:
+            continue
+        rows.append((login, status, a, r))
+
+    rows.sort(key=lambda t: (t[0].lower(), _status_sort_key(t[1]), t[1]))
+    return rows
+
+
+def render_markdown(rows: List[Row]) -> str:
+    """GFM table with linked username, state label, assignee count, reviewer count."""
+    lines = [
+        "| github username | state | assignee | reviewer |",
+        "| --- | --- | --- | --- |",
+    ]
+
+    for login, status, a_count, r_count in rows:
+        user_q = _filter_body(BASE_FILTER, login)
+        state_q = _filter_body(BASE_FILTER, f'status:"{status}"', login)
+        assign_q = _filter_body(BASE_FILTER, f'status:"{status}"', f"assignee:{login}")
+        rev_q = _filter_body(BASE_FILTER, f'status:"{status}"', f"reviewers:{login}")
+
+        lines.append(
+            "| "
+            + f"[{login}]({view2_url(user_q)}) | "
+            + f"[{status}]({view2_url(state_q)}) | "
+            + f"[{a_count}]({view2_url(assign_q)}) | "
+            + f"[{r_count}]({view2_url(rev_q)}) |"
+        )
+
+    return "\n".join(lines) + "\n"
diff --git a/foc-pr-report/pyproject.toml b/foc-pr-report/pyproject.toml
new file mode 100644
index 0000000..d98391e
--- /dev/null
+++ b/foc-pr-report/pyproject.toml
@@ -0,0 +1,14 @@
+[project]
+name = "foc-pr-report"
+version = "0.1.0"
+description = "Markdown report for FilOzone GitHub Project 14 (FOC) PR workload by person and status"
+readme = "README.md"
+requires-python = ">=3.10"
+dependencies = ["requests>=2.31"]
+
+[project.scripts]
+foc-pr-report = "foc_pr_report.cli:main"
+
+[build-system]
+requires = ["hatchling>=1.24.2"]
+build-backend = "hatchling.build"
diff --git a/foc-pr-report/uv.lock b/foc-pr-report/uv.lock
new file mode 100644
index 0000000..0b3644b
--- /dev/null
+++ b/foc-pr-report/uv.lock
@@ -0,0 +1,161 @@
+version = 1
+revision = 1
+requires-python = ">=3.10"
+
+[[package]]
+name = "certifi"
+version = "2026.2.25"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/af/2d/7bf41579a8986e348fa033a31cdd0e4121114f6bce2457e8876010b092dd/certifi-2026.2.25.tar.gz", hash = "sha256:e887ab5cee78ea814d3472169153c2d12cd43b14bd03329a39a9c6e2e80bfba7", size = 155029 }
+wheels = [
+    { url = "https://files.pythonhosted.org/packages/9a/3c/c17fb3ca2d9c3acff52e30b309f538586f9f5b9c9cf454f3845fc9af4881/certifi-2026.2.25-py3-none-any.whl", hash = "sha256:027692e4402ad994f1c42e52a4997a9763c646b73e4096e4d5d6db8af1d6f0fa", size = 153684 },
+]
+
+[[package]]
+name = "charset-normalizer"
+version = "3.4.6"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/7b/60/e3bec1881450851b087e301bedc3daa9377a4d45f1c26aa90b0b235e38aa/charset_normalizer-3.4.6.tar.gz", hash =
"sha256:1ae6b62897110aa7c79ea2f5dd38d1abca6db663687c0b1ad9aed6f6bae3d9d6", size = 143363 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/e6/8c/2c56124c6dc53a774d435f985b5973bc592f42d437be58c0c92d65ae7296/charset_normalizer-3.4.6-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:2e1d8ca8611099001949d1cdfaefc510cf0f212484fe7c565f735b68c78c3c95", size = 298751 }, + { url = "https://files.pythonhosted.org/packages/86/2a/2a7db6b314b966a3bcad8c731c0719c60b931b931de7ae9f34b2839289ee/charset_normalizer-3.4.6-cp310-cp310-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:e25369dc110d58ddf29b949377a93e0716d72a24f62bad72b2b39f155949c1fd", size = 200027 }, + { url = "https://files.pythonhosted.org/packages/68/f2/0fe775c74ae25e2a3b07b01538fc162737b3e3f795bada3bc26f4d4d495c/charset_normalizer-3.4.6-cp310-cp310-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:259695e2ccc253feb2a016303543d691825e920917e31f894ca1a687982b1de4", size = 220741 }, + { url = "https://files.pythonhosted.org/packages/10/98/8085596e41f00b27dd6aa1e68413d1ddda7e605f34dd546833c61fddd709/charset_normalizer-3.4.6-cp310-cp310-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:dda86aba335c902b6149a02a55b38e96287157e609200811837678214ba2b1db", size = 215802 }, + { url = "https://files.pythonhosted.org/packages/fd/ce/865e4e09b041bad659d682bbd98b47fb490b8e124f9398c9448065f64fee/charset_normalizer-3.4.6-cp310-cp310-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:51fb3c322c81d20567019778cb5a4a6f2dc1c200b886bc0d636238e364848c89", size = 207908 }, + { url = "https://files.pythonhosted.org/packages/a8/54/8c757f1f7349262898c2f169e0d562b39dcb977503f18fdf0814e923db78/charset_normalizer-3.4.6-cp310-cp310-manylinux_2_31_armv7l.whl", hash = "sha256:4482481cb0572180b6fd976a4d5c72a30263e98564da68b86ec91f0fe35e8565", size = 194357 }, + { url = 
"https://files.pythonhosted.org/packages/6f/29/e88f2fac9218907fc7a70722b393d1bbe8334c61fe9c46640dba349b6e66/charset_normalizer-3.4.6-cp310-cp310-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:39f5068d35621da2881271e5c3205125cc456f54e9030d3f723288c873a71bf9", size = 205610 }, + { url = "https://files.pythonhosted.org/packages/4c/c5/21d7bb0cb415287178450171d130bed9d664211fdd59731ed2c34267b07d/charset_normalizer-3.4.6-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:8bea55c4eef25b0b19a0337dc4e3f9a15b00d569c77211fa8cde38684f234fb7", size = 203512 }, + { url = "https://files.pythonhosted.org/packages/a4/be/ce52f3c7fdb35cc987ad38a53ebcef52eec498f4fb6c66ecfe62cfe57ba2/charset_normalizer-3.4.6-cp310-cp310-musllinux_1_2_armv7l.whl", hash = "sha256:f0cdaecd4c953bfae0b6bb64910aaaca5a424ad9c72d85cb88417bb9814f7550", size = 195398 }, + { url = "https://files.pythonhosted.org/packages/81/a0/3ab5dd39d4859a3555e5dadfc8a9fa7f8352f8c183d1a65c90264517da0e/charset_normalizer-3.4.6-cp310-cp310-musllinux_1_2_ppc64le.whl", hash = "sha256:150b8ce8e830eb7ccb029ec9ca36022f756986aaaa7956aad6d9ec90089338c0", size = 221772 }, + { url = "https://files.pythonhosted.org/packages/04/6e/6a4e41a97ba6b2fa87f849c41e4d229449a586be85053c4d90135fe82d26/charset_normalizer-3.4.6-cp310-cp310-musllinux_1_2_riscv64.whl", hash = "sha256:e68c14b04827dd76dcbd1aeea9e604e3e4b78322d8faf2f8132c7138efa340a8", size = 205759 }, + { url = "https://files.pythonhosted.org/packages/db/3b/34a712a5ee64a6957bf355b01dc17b12de457638d436fdb05d01e463cd1c/charset_normalizer-3.4.6-cp310-cp310-musllinux_1_2_s390x.whl", hash = "sha256:3778fd7d7cd04ae8f54651f4a7a0bd6e39a0cf20f801720a4c21d80e9b7ad6b0", size = 216938 }, + { url = "https://files.pythonhosted.org/packages/cb/05/5bd1e12da9ab18790af05c61aafd01a60f489778179b621ac2a305243c62/charset_normalizer-3.4.6-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:dad6e0f2e481fffdcf776d10ebee25e0ef89f16d691f1e5dee4b586375fdc64b", size = 210138 }, + { url = 
"https://files.pythonhosted.org/packages/bd/8e/3cb9e2d998ff6b21c0a1860343cb7b83eba9cdb66b91410e18fc4969d6ab/charset_normalizer-3.4.6-cp310-cp310-win32.whl", hash = "sha256:74a2e659c7ecbc73562e2a15e05039f1e22c75b7c7618b4b574a3ea9118d1557", size = 144137 }, + { url = "https://files.pythonhosted.org/packages/d8/8f/78f5489ffadb0db3eb7aff53d31c24531d33eb545f0c6f6567c25f49a5ff/charset_normalizer-3.4.6-cp310-cp310-win_amd64.whl", hash = "sha256:aa9cccf4a44b9b62d8ba8b4dd06c649ba683e4bf04eea606d2e94cfc2d6ff4d6", size = 154244 }, + { url = "https://files.pythonhosted.org/packages/e4/74/e472659dffb0cadb2f411282d2d76c60da1fc94076d7fffed4ae8a93ec01/charset_normalizer-3.4.6-cp310-cp310-win_arm64.whl", hash = "sha256:e985a16ff513596f217cee86c21371b8cd011c0f6f056d0920aa2d926c544058", size = 143312 }, + { url = "https://files.pythonhosted.org/packages/62/28/ff6f234e628a2de61c458be2779cb182bc03f6eec12200d4a525bbfc9741/charset_normalizer-3.4.6-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:82060f995ab5003a2d6e0f4ad29065b7672b6593c8c63559beefe5b443242c3e", size = 293582 }, + { url = "https://files.pythonhosted.org/packages/1c/b7/b1a117e5385cbdb3205f6055403c2a2a220c5ea80b8716c324eaf75c5c95/charset_normalizer-3.4.6-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:60c74963d8350241a79cb8feea80e54d518f72c26db618862a8f53e5023deaf9", size = 197240 }, + { url = "https://files.pythonhosted.org/packages/a1/5f/2574f0f09f3c3bc1b2f992e20bce6546cb1f17e111c5be07308dc5427956/charset_normalizer-3.4.6-cp311-cp311-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:f6e4333fb15c83f7d1482a76d45a0818897b3d33f00efd215528ff7c51b8e35d", size = 217363 }, + { url = "https://files.pythonhosted.org/packages/4a/d1/0ae20ad77bc949ddd39b51bf383b6ca932f2916074c95cad34ae465ab71f/charset_normalizer-3.4.6-cp311-cp311-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = 
"sha256:bc72863f4d9aba2e8fd9085e63548a324ba706d2ea2c83b260da08a59b9482de", size = 212994 }, + { url = "https://files.pythonhosted.org/packages/60/ac/3233d262a310c1b12633536a07cde5ddd16985e6e7e238e9f3f9423d8eb9/charset_normalizer-3.4.6-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:9cc4fc6c196d6a8b76629a70ddfcd4635a6898756e2d9cac5565cf0654605d73", size = 204697 }, + { url = "https://files.pythonhosted.org/packages/25/3c/8a18fc411f085b82303cfb7154eed5bd49c77035eb7608d049468b53f87c/charset_normalizer-3.4.6-cp311-cp311-manylinux_2_31_armv7l.whl", hash = "sha256:0c173ce3a681f309f31b87125fecec7a5d1347261ea11ebbb856fa6006b23c8c", size = 191673 }, + { url = "https://files.pythonhosted.org/packages/ff/a7/11cfe61d6c5c5c7438d6ba40919d0306ed83c9ab957f3d4da2277ff67836/charset_normalizer-3.4.6-cp311-cp311-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:c907cdc8109f6c619e6254212e794d6548373cc40e1ec75e6e3823d9135d29cc", size = 201120 }, + { url = "https://files.pythonhosted.org/packages/b5/10/cf491fa1abd47c02f69687046b896c950b92b6cd7337a27e6548adbec8e4/charset_normalizer-3.4.6-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:404a1e552cf5b675a87f0651f8b79f5f1e6fd100ee88dc612f89aa16abd4486f", size = 200911 }, + { url = "https://files.pythonhosted.org/packages/28/70/039796160b48b18ed466fde0af84c1b090c4e288fae26cd674ad04a2d703/charset_normalizer-3.4.6-cp311-cp311-musllinux_1_2_armv7l.whl", hash = "sha256:e3c701e954abf6fc03a49f7c579cc80c2c6cc52525340ca3186c41d3f33482ef", size = 192516 }, + { url = "https://files.pythonhosted.org/packages/ff/34/c56f3223393d6ff3124b9e78f7de738047c2d6bc40a4f16ac0c9d7a1cb3c/charset_normalizer-3.4.6-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = "sha256:7a6967aaf043bceabab5412ed6bd6bd26603dae84d5cb75bf8d9a74a4959d398", size = 218795 }, + { url = 
"https://files.pythonhosted.org/packages/e8/3b/ce2d4f86c5282191a041fdc5a4ce18f1c6bd40a5bd1f74cf8625f08d51c1/charset_normalizer-3.4.6-cp311-cp311-musllinux_1_2_riscv64.whl", hash = "sha256:5feb91325bbceade6afab43eb3b508c63ee53579fe896c77137ded51c6b6958e", size = 201833 }, + { url = "https://files.pythonhosted.org/packages/3b/9b/b6a9f76b0fd7c5b5ec58b228ff7e85095370282150f0bd50b3126f5506d6/charset_normalizer-3.4.6-cp311-cp311-musllinux_1_2_s390x.whl", hash = "sha256:f820f24b09e3e779fe84c3c456cb4108a7aa639b0d1f02c28046e11bfcd088ed", size = 213920 }, + { url = "https://files.pythonhosted.org/packages/ae/98/7bc23513a33d8172365ed30ee3a3b3fe1ece14a395e5fc94129541fc6003/charset_normalizer-3.4.6-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:b35b200d6a71b9839a46b9b7fff66b6638bb52fc9658aa58796b0326595d3021", size = 206951 }, + { url = "https://files.pythonhosted.org/packages/32/73/c0b86f3d1458468e11aec870e6b3feac931facbe105a894b552b0e518e79/charset_normalizer-3.4.6-cp311-cp311-win32.whl", hash = "sha256:9ca4c0b502ab399ef89248a2c84c54954f77a070f28e546a85e91da627d1301e", size = 143703 }, + { url = "https://files.pythonhosted.org/packages/c6/e3/76f2facfe8eddee0bbd38d2594e709033338eae44ebf1738bcefe0a06185/charset_normalizer-3.4.6-cp311-cp311-win_amd64.whl", hash = "sha256:a9e68c9d88823b274cf1e72f28cb5dc89c990edf430b0bfd3e2fb0785bfeabf4", size = 153857 }, + { url = "https://files.pythonhosted.org/packages/e2/dc/9abe19c9b27e6cd3636036b9d1b387b78c40dedbf0b47f9366737684b4b0/charset_normalizer-3.4.6-cp311-cp311-win_arm64.whl", hash = "sha256:97d0235baafca5f2b09cf332cc275f021e694e8362c6bb9c96fc9a0eb74fc316", size = 142751 }, + { url = "https://files.pythonhosted.org/packages/e5/62/c0815c992c9545347aeea7859b50dc9044d147e2e7278329c6e02ac9a616/charset_normalizer-3.4.6-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:2ef7fedc7a6ecbe99969cd09632516738a97eeb8bd7258bf8a0f23114c057dab", size = 295154 }, + { url = 
"https://files.pythonhosted.org/packages/a8/37/bdca6613c2e3c58c7421891d80cc3efa1d32e882f7c4a7ee6039c3fc951a/charset_normalizer-3.4.6-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:a4ea868bc28109052790eb2b52a9ab33f3aa7adc02f96673526ff47419490e21", size = 199191 }, + { url = "https://files.pythonhosted.org/packages/6c/92/9934d1bbd69f7f398b38c5dae1cbf9cc672e7c34a4adf7b17c0a9c17d15d/charset_normalizer-3.4.6-cp312-cp312-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:836ab36280f21fc1a03c99cd05c6b7af70d2697e374c7af0b61ed271401a72a2", size = 218674 }, + { url = "https://files.pythonhosted.org/packages/af/90/25f6ab406659286be929fd89ab0e78e38aa183fc374e03aa3c12d730af8a/charset_normalizer-3.4.6-cp312-cp312-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:f1ce721c8a7dfec21fcbdfe04e8f68174183cf4e8188e0645e92aa23985c57ff", size = 215259 }, + { url = "https://files.pythonhosted.org/packages/4e/ef/79a463eb0fff7f96afa04c1d4c51f8fc85426f918db467854bfb6a569ce3/charset_normalizer-3.4.6-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:0e28d62a8fc7a1fa411c43bd65e346f3bce9716dc51b897fbe930c5987b402d5", size = 207276 }, + { url = "https://files.pythonhosted.org/packages/f7/72/d0426afec4b71dc159fa6b4e68f868cd5a3ecd918fec5813a15d292a7d10/charset_normalizer-3.4.6-cp312-cp312-manylinux_2_31_armv7l.whl", hash = "sha256:530d548084c4a9f7a16ed4a294d459b4f229db50df689bfe92027452452943a0", size = 195161 }, + { url = "https://files.pythonhosted.org/packages/bf/18/c82b06a68bfcb6ce55e508225d210c7e6a4ea122bfc0748892f3dc4e8e11/charset_normalizer-3.4.6-cp312-cp312-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:30f445ae60aad5e1f8bdbb3108e39f6fbc09f4ea16c815c66578878325f8f15a", size = 203452 }, + { url = 
"https://files.pythonhosted.org/packages/44/d6/0c25979b92f8adafdbb946160348d8d44aa60ce99afdc27df524379875cb/charset_normalizer-3.4.6-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:ac2393c73378fea4e52aa56285a3d64be50f1a12395afef9cce47772f60334c2", size = 202272 }, + { url = "https://files.pythonhosted.org/packages/2e/3d/7fea3e8fe84136bebbac715dd1221cc25c173c57a699c030ab9b8900cbb7/charset_normalizer-3.4.6-cp312-cp312-musllinux_1_2_armv7l.whl", hash = "sha256:90ca27cd8da8118b18a52d5f547859cc1f8354a00cd1e8e5120df3e30d6279e5", size = 195622 }, + { url = "https://files.pythonhosted.org/packages/57/8a/d6f7fd5cb96c58ef2f681424fbca01264461336d2a7fc875e4446b1f1346/charset_normalizer-3.4.6-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:8e5a94886bedca0f9b78fecd6afb6629142fd2605aa70a125d49f4edc6037ee6", size = 220056 }, + { url = "https://files.pythonhosted.org/packages/16/50/478cdda782c8c9c3fb5da3cc72dd7f331f031e7f1363a893cdd6ca0f8de0/charset_normalizer-3.4.6-cp312-cp312-musllinux_1_2_riscv64.whl", hash = "sha256:695f5c2823691a25f17bc5d5ffe79fa90972cc34b002ac6c843bb8a1720e950d", size = 203751 }, + { url = "https://files.pythonhosted.org/packages/75/fc/cc2fcac943939c8e4d8791abfa139f685e5150cae9f94b60f12520feaa9b/charset_normalizer-3.4.6-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:231d4da14bcd9301310faf492051bee27df11f2bc7549bc0bb41fef11b82daa2", size = 216563 }, + { url = "https://files.pythonhosted.org/packages/a8/b7/a4add1d9a5f68f3d037261aecca83abdb0ab15960a3591d340e829b37298/charset_normalizer-3.4.6-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:a056d1ad2633548ca18ffa2f85c202cfb48b68615129143915b8dc72a806a923", size = 209265 }, + { url = "https://files.pythonhosted.org/packages/6c/18/c094561b5d64a24277707698e54b7f67bd17a4f857bbfbb1072bba07c8bf/charset_normalizer-3.4.6-cp312-cp312-win32.whl", hash = "sha256:c2274ca724536f173122f36c98ce188fd24ce3dad886ec2b7af859518ce008a4", size = 144229 }, + { url = 
"https://files.pythonhosted.org/packages/ab/20/0567efb3a8fd481b8f34f739ebddc098ed062a59fed41a8d193a61939e8f/charset_normalizer-3.4.6-cp312-cp312-win_amd64.whl", hash = "sha256:c8ae56368f8cc97c7e40a7ee18e1cedaf8e780cd8bc5ed5ac8b81f238614facb", size = 154277 }, + { url = "https://files.pythonhosted.org/packages/15/57/28d79b44b51933119e21f65479d0864a8d5893e494cf5daab15df0247c17/charset_normalizer-3.4.6-cp312-cp312-win_arm64.whl", hash = "sha256:899d28f422116b08be5118ef350c292b36fc15ec2daeb9ea987c89281c7bb5c4", size = 142817 }, + { url = "https://files.pythonhosted.org/packages/1e/1d/4fdabeef4e231153b6ed7567602f3b68265ec4e5b76d6024cf647d43d981/charset_normalizer-3.4.6-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:11afb56037cbc4b1555a34dd69151e8e069bee82e613a73bef6e714ce733585f", size = 294823 }, + { url = "https://files.pythonhosted.org/packages/47/7b/20e809b89c69d37be748d98e84dce6820bf663cf19cf6b942c951a3e8f41/charset_normalizer-3.4.6-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:423fb7e748a08f854a08a222b983f4df1912b1daedce51a72bd24fe8f26a1843", size = 198527 }, + { url = "https://files.pythonhosted.org/packages/37/a6/4f8d27527d59c039dce6f7622593cdcd3d70a8504d87d09eb11e9fdc6062/charset_normalizer-3.4.6-cp313-cp313-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:d73beaac5e90173ac3deb9928a74763a6d230f494e4bfb422c217a0ad8e629bf", size = 218388 }, + { url = "https://files.pythonhosted.org/packages/f6/9b/4770ccb3e491a9bacf1c46cc8b812214fe367c86a96353ccc6daf87b01ec/charset_normalizer-3.4.6-cp313-cp313-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:d60377dce4511655582e300dc1e5a5f24ba0cb229005a1d5c8d0cb72bb758ab8", size = 214563 }, + { url = 
"https://files.pythonhosted.org/packages/2b/58/a199d245894b12db0b957d627516c78e055adc3a0d978bc7f65ddaf7c399/charset_normalizer-3.4.6-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:530e8cebeea0d76bdcf93357aa5e41336f48c3dc709ac52da2bb167c5b8271d9", size = 206587 }, + { url = "https://files.pythonhosted.org/packages/7e/70/3def227f1ec56f5c69dfc8392b8bd63b11a18ca8178d9211d7cc5e5e4f27/charset_normalizer-3.4.6-cp313-cp313-manylinux_2_31_armv7l.whl", hash = "sha256:a26611d9987b230566f24a0a125f17fe0de6a6aff9f25c9f564aaa2721a5fb88", size = 194724 }, + { url = "https://files.pythonhosted.org/packages/58/ab/9318352e220c05efd31c2779a23b50969dc94b985a2efa643ed9077bfca5/charset_normalizer-3.4.6-cp313-cp313-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:34315ff4fc374b285ad7f4a0bf7dcbfe769e1b104230d40f49f700d4ab6bbd84", size = 202956 }, + { url = "https://files.pythonhosted.org/packages/75/13/f3550a3ac25b70f87ac98c40d3199a8503676c2f1620efbf8d42095cfc40/charset_normalizer-3.4.6-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:5f8ddd609f9e1af8c7bd6e2aca279c931aefecd148a14402d4e368f3171769fd", size = 201923 }, + { url = "https://files.pythonhosted.org/packages/1b/db/c5c643b912740b45e8eec21de1bbab8e7fc085944d37e1e709d3dcd9d72f/charset_normalizer-3.4.6-cp313-cp313-musllinux_1_2_armv7l.whl", hash = "sha256:80d0a5615143c0b3225e5e3ef22c8d5d51f3f72ce0ea6fb84c943546c7b25b6c", size = 195366 }, + { url = "https://files.pythonhosted.org/packages/5a/67/3b1c62744f9b2448443e0eb160d8b001c849ec3fef591e012eda6484787c/charset_normalizer-3.4.6-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:92734d4d8d187a354a556626c221cd1a892a4e0802ccb2af432a1d85ec012194", size = 219752 }, + { url = "https://files.pythonhosted.org/packages/f6/98/32ffbaf7f0366ffb0445930b87d103f6b406bc2c271563644bde8a2b1093/charset_normalizer-3.4.6-cp313-cp313-musllinux_1_2_riscv64.whl", hash = 
"sha256:613f19aa6e082cf96e17e3ffd89383343d0d589abda756b7764cf78361fd41dc", size = 203296 }, + { url = "https://files.pythonhosted.org/packages/41/12/5d308c1bbe60cabb0c5ef511574a647067e2a1f631bc8634fcafaccd8293/charset_normalizer-3.4.6-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:2b1a63e8224e401cafe7739f77efd3f9e7f5f2026bda4aead8e59afab537784f", size = 215956 }, + { url = "https://files.pythonhosted.org/packages/53/e9/5f85f6c5e20669dbe56b165c67b0260547dea97dba7e187938833d791687/charset_normalizer-3.4.6-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:6cceb5473417d28edd20c6c984ab6fee6c6267d38d906823ebfe20b03d607dc2", size = 208652 }, + { url = "https://files.pythonhosted.org/packages/f1/11/897052ea6af56df3eef3ca94edafee410ca699ca0c7b87960ad19932c55e/charset_normalizer-3.4.6-cp313-cp313-win32.whl", hash = "sha256:d7de2637729c67d67cf87614b566626057e95c303bc0a55ffe391f5205e7003d", size = 143940 }, + { url = "https://files.pythonhosted.org/packages/a1/5c/724b6b363603e419829f561c854b87ed7c7e31231a7908708ac086cdf3e2/charset_normalizer-3.4.6-cp313-cp313-win_amd64.whl", hash = "sha256:572d7c822caf521f0525ba1bce1a622a0b85cf47ffbdae6c9c19e3b5ac3c4389", size = 154101 }, + { url = "https://files.pythonhosted.org/packages/01/a5/7abf15b4c0968e47020f9ca0935fb3274deb87cb288cd187cad92e8cdffd/charset_normalizer-3.4.6-cp313-cp313-win_arm64.whl", hash = "sha256:a4474d924a47185a06411e0064b803c68be044be2d60e50e8bddcc2649957c1f", size = 143109 }, + { url = "https://files.pythonhosted.org/packages/25/6f/ffe1e1259f384594063ea1869bfb6be5cdb8bc81020fc36c3636bc8302a1/charset_normalizer-3.4.6-cp314-cp314-macosx_10_15_universal2.whl", hash = "sha256:9cc6e6d9e571d2f863fa77700701dae73ed5f78881efc8b3f9a4398772ff53e8", size = 294458 }, + { url = "https://files.pythonhosted.org/packages/56/60/09bb6c13a8c1016c2ed5c6a6488e4ffef506461aa5161662bd7636936fb1/charset_normalizer-3.4.6-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = 
"sha256:ef5960d965e67165d75b7c7ffc60a83ec5abfc5c11b764ec13ea54fbef8b4421", size = 199277 }, + { url = "https://files.pythonhosted.org/packages/00/50/dcfbb72a5138bbefdc3332e8d81a23494bf67998b4b100703fd15fa52d81/charset_normalizer-3.4.6-cp314-cp314-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:b3694e3f87f8ac7ce279d4355645b3c878d24d1424581b46282f24b92f5a4ae2", size = 218758 }, + { url = "https://files.pythonhosted.org/packages/03/b3/d79a9a191bb75f5aa81f3aaaa387ef29ce7cb7a9e5074ba8ea095cc073c2/charset_normalizer-3.4.6-cp314-cp314-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:5d11595abf8dd942a77883a39d81433739b287b6aa71620f15164f8096221b30", size = 215299 }, + { url = "https://files.pythonhosted.org/packages/76/7e/bc8911719f7084f72fd545f647601ea3532363927f807d296a8c88a62c0d/charset_normalizer-3.4.6-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:7bda6eebafd42133efdca535b04ccb338ab29467b3f7bf79569883676fc628db", size = 206811 }, + { url = "https://files.pythonhosted.org/packages/e2/40/c430b969d41dda0c465aa36cc7c2c068afb67177bef50905ac371b28ccc7/charset_normalizer-3.4.6-cp314-cp314-manylinux_2_31_armv7l.whl", hash = "sha256:bbc8c8650c6e51041ad1be191742b8b421d05bbd3410f43fa2a00c8db87678e8", size = 193706 }, + { url = "https://files.pythonhosted.org/packages/48/15/e35e0590af254f7df984de1323640ef375df5761f615b6225ba8deb9799a/charset_normalizer-3.4.6-cp314-cp314-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:22c6f0c2fbc31e76c3b8a86fba1a56eda6166e238c29cdd3d14befdb4a4e4815", size = 202706 }, + { url = "https://files.pythonhosted.org/packages/5e/bd/f736f7b9cc5e93a18b794a50346bb16fbfd6b37f99e8f306f7951d27c17c/charset_normalizer-3.4.6-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:7edbed096e4a4798710ed6bc75dcaa2a21b68b6c356553ac4823c3658d53743a", size = 202497 }, + { url = 
"https://files.pythonhosted.org/packages/9d/ba/2cc9e3e7dfdf7760a6ed8da7446d22536f3d0ce114ac63dee2a5a3599e62/charset_normalizer-3.4.6-cp314-cp314-musllinux_1_2_armv7l.whl", hash = "sha256:7f9019c9cb613f084481bd6a100b12e1547cf2efe362d873c2e31e4035a6fa43", size = 193511 }, + { url = "https://files.pythonhosted.org/packages/9e/cb/5be49b5f776e5613be07298c80e1b02a2d900f7a7de807230595c85a8b2e/charset_normalizer-3.4.6-cp314-cp314-musllinux_1_2_ppc64le.whl", hash = "sha256:58c948d0d086229efc484fe2f30c2d382c86720f55cd9bc33591774348ad44e0", size = 220133 }, + { url = "https://files.pythonhosted.org/packages/83/43/99f1b5dad345accb322c80c7821071554f791a95ee50c1c90041c157ae99/charset_normalizer-3.4.6-cp314-cp314-musllinux_1_2_riscv64.whl", hash = "sha256:419a9d91bd238052642a51938af8ac05da5b3343becde08d5cdeab9046df9ee1", size = 203035 }, + { url = "https://files.pythonhosted.org/packages/87/9a/62c2cb6a531483b55dddff1a68b3d891a8b498f3ca555fbcf2978e804d9d/charset_normalizer-3.4.6-cp314-cp314-musllinux_1_2_s390x.whl", hash = "sha256:5273b9f0b5835ff0350c0828faea623c68bfa65b792720c453e22b25cc72930f", size = 216321 }, + { url = "https://files.pythonhosted.org/packages/6e/79/94a010ff81e3aec7c293eb82c28f930918e517bc144c9906a060844462eb/charset_normalizer-3.4.6-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:0e901eb1049fdb80f5bd11ed5ea1e498ec423102f7a9b9e4645d5b8204ff2815", size = 208973 }, + { url = "https://files.pythonhosted.org/packages/2a/57/4ecff6d4ec8585342f0c71bc03efaa99cb7468f7c91a57b105bcd561cea8/charset_normalizer-3.4.6-cp314-cp314-win32.whl", hash = "sha256:b4ff1d35e8c5bd078be89349b6f3a845128e685e751b6ea1169cf2160b344c4d", size = 144610 }, + { url = "https://files.pythonhosted.org/packages/80/94/8434a02d9d7f168c25767c64671fead8d599744a05d6a6c877144c754246/charset_normalizer-3.4.6-cp314-cp314-win_amd64.whl", hash = "sha256:74119174722c4349af9708993118581686f343adc1c8c9c007d59be90d077f3f", size = 154962 }, + { url = 
"https://files.pythonhosted.org/packages/46/4c/48f2cdbfd923026503dfd67ccea45c94fd8fe988d9056b468579c66ed62b/charset_normalizer-3.4.6-cp314-cp314-win_arm64.whl", hash = "sha256:e5bcc1a1ae744e0bb59641171ae53743760130600da8db48cbb6e4918e186e4e", size = 143595 }, + { url = "https://files.pythonhosted.org/packages/31/93/8878be7569f87b14f1d52032946131bcb6ebbd8af3e20446bc04053dc3f1/charset_normalizer-3.4.6-cp314-cp314t-macosx_10_15_universal2.whl", hash = "sha256:ad8faf8df23f0378c6d527d8b0b15ea4a2e23c89376877c598c4870d1b2c7866", size = 314828 }, + { url = "https://files.pythonhosted.org/packages/06/b6/fae511ca98aac69ecc35cde828b0a3d146325dd03d99655ad38fc2cc3293/charset_normalizer-3.4.6-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:f5ea69428fa1b49573eef0cc44a1d43bebd45ad0c611eb7d7eac760c7ae771bc", size = 208138 }, + { url = "https://files.pythonhosted.org/packages/54/57/64caf6e1bf07274a1e0b7c160a55ee9e8c9ec32c46846ce59b9c333f7008/charset_normalizer-3.4.6-cp314-cp314t-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:06a7e86163334edfc5d20fe104db92fcd666e5a5df0977cb5680a506fe26cc8e", size = 224679 }, + { url = "https://files.pythonhosted.org/packages/aa/cb/9ff5a25b9273ef160861b41f6937f86fae18b0792fe0a8e75e06acb08f1d/charset_normalizer-3.4.6-cp314-cp314t-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:e1f6e2f00a6b8edb562826e4632e26d063ac10307e80f7461f7de3ad8ef3f077", size = 223475 }, + { url = "https://files.pythonhosted.org/packages/fc/97/440635fc093b8d7347502a377031f9605a1039c958f3cd18dcacffb37743/charset_normalizer-3.4.6-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:95b52c68d64c1878818687a473a10547b3292e82b6f6fe483808fb1468e2f52f", size = 215230 }, + { url = 
"https://files.pythonhosted.org/packages/cd/24/afff630feb571a13f07c8539fbb502d2ab494019492aaffc78ef41f1d1d0/charset_normalizer-3.4.6-cp314-cp314t-manylinux_2_31_armv7l.whl", hash = "sha256:7504e9b7dc05f99a9bbb4525c67a2c155073b44d720470a148b34166a69c054e", size = 199045 }, + { url = "https://files.pythonhosted.org/packages/e5/17/d1399ecdaf7e0498c327433e7eefdd862b41236a7e484355b8e0e5ebd64b/charset_normalizer-3.4.6-cp314-cp314t-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:172985e4ff804a7ad08eebec0a1640ece87ba5041d565fff23c8f99c1f389484", size = 211658 }, + { url = "https://files.pythonhosted.org/packages/b5/38/16baa0affb957b3d880e5ac2144caf3f9d7de7bc4a91842e447fbb5e8b67/charset_normalizer-3.4.6-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:4be9f4830ba8741527693848403e2c457c16e499100963ec711b1c6f2049b7c7", size = 210769 }, + { url = "https://files.pythonhosted.org/packages/05/34/c531bc6ac4c21da9ddfddb3107be2287188b3ea4b53b70fc58f2a77ac8d8/charset_normalizer-3.4.6-cp314-cp314t-musllinux_1_2_armv7l.whl", hash = "sha256:79090741d842f564b1b2827c0b82d846405b744d31e84f18d7a7b41c20e473ff", size = 201328 }, + { url = "https://files.pythonhosted.org/packages/fa/73/a5a1e9ca5f234519c1953608a03fe109c306b97fdfb25f09182babad51a7/charset_normalizer-3.4.6-cp314-cp314t-musllinux_1_2_ppc64le.whl", hash = "sha256:87725cfb1a4f1f8c2fc9890ae2f42094120f4b44db9360be5d99a4c6b0e03a9e", size = 225302 }, + { url = "https://files.pythonhosted.org/packages/ba/f6/cd782923d112d296294dea4bcc7af5a7ae0f86ab79f8fefbda5526b6cfc0/charset_normalizer-3.4.6-cp314-cp314t-musllinux_1_2_riscv64.whl", hash = "sha256:fcce033e4021347d80ed9c66dcf1e7b1546319834b74445f561d2e2221de5659", size = 211127 }, + { url = "https://files.pythonhosted.org/packages/0e/c5/0b6898950627af7d6103a449b22320372c24c6feda91aa24e201a478d161/charset_normalizer-3.4.6-cp314-cp314t-musllinux_1_2_s390x.whl", hash = "sha256:ca0276464d148c72defa8bb4390cce01b4a0e425f3b50d1435aa6d7a18107602", size = 222840 }, + { url 
= "https://files.pythonhosted.org/packages/7d/25/c4bba773bef442cbdc06111d40daa3de5050a676fa26e85090fc54dd12f0/charset_normalizer-3.4.6-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:197c1a244a274bb016dd8b79204850144ef77fe81c5b797dc389327adb552407", size = 216890 }, + { url = "https://files.pythonhosted.org/packages/35/1a/05dacadb0978da72ee287b0143097db12f2e7e8d3ffc4647da07a383b0b7/charset_normalizer-3.4.6-cp314-cp314t-win32.whl", hash = "sha256:2a24157fa36980478dd1770b585c0f30d19e18f4fb0c47c13aa568f871718579", size = 155379 }, + { url = "https://files.pythonhosted.org/packages/5d/7a/d269d834cb3a76291651256f3b9a5945e81d0a49ab9f4a498964e83c0416/charset_normalizer-3.4.6-cp314-cp314t-win_amd64.whl", hash = "sha256:cd5e2801c89992ed8c0a3f0293ae83c159a60d9a5d685005383ef4caca77f2c4", size = 169043 }, + { url = "https://files.pythonhosted.org/packages/23/06/28b29fba521a37a8932c6a84192175c34d49f84a6d4773fa63d05f9aff22/charset_normalizer-3.4.6-cp314-cp314t-win_arm64.whl", hash = "sha256:47955475ac79cc504ef2704b192364e51d0d473ad452caedd0002605f780101c", size = 148523 }, + { url = "https://files.pythonhosted.org/packages/2a/68/687187c7e26cb24ccbd88e5069f5ef00eba804d36dde11d99aad0838ab45/charset_normalizer-3.4.6-py3-none-any.whl", hash = "sha256:947cf925bc916d90adba35a64c82aace04fa39b46b52d4630ece166655905a69", size = 61455 }, +] + +[[package]] +name = "foc-pr-report" +version = "0.1.0" +source = { editable = "." 
} +dependencies = [ + { name = "requests" }, +] + +[package.metadata] +requires-dist = [{ name = "requests", specifier = ">=2.31" }] + +[[package]] +name = "idna" +version = "3.11" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/6f/6d/0703ccc57f3a7233505399edb88de3cbd678da106337b9fcde432b65ed60/idna-3.11.tar.gz", hash = "sha256:795dafcc9c04ed0c1fb032c2aa73654d8e8c5023a7df64a53f39190ada629902", size = 194582 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/0e/61/66938bbb5fc52dbdf84594873d5b51fb1f7c7794e9c0f5bd885f30bc507b/idna-3.11-py3-none-any.whl", hash = "sha256:771a87f49d9defaf64091e6e6fe9c18d4833f140bd19464795bc32d966ca37ea", size = 71008 }, +] + +[[package]] +name = "requests" +version = "2.33.1" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "certifi" }, + { name = "charset-normalizer" }, + { name = "idna" }, + { name = "urllib3" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/5f/a4/98b9c7c6428a668bf7e42ebb7c79d576a1c3c1e3ae2d47e674b468388871/requests-2.33.1.tar.gz", hash = "sha256:18817f8c57c6263968bc123d237e3b8b08ac046f5456bd1e307ee8f4250d3517", size = 134120 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/d7/8e/7540e8a2036f79a125c1d2ebadf69ed7901608859186c856fa0388ef4197/requests-2.33.1-py3-none-any.whl", hash = "sha256:4e6d1ef462f3626a1f0a0a9c42dd93c63bad33f9f1c1937509b8c5c8718ab56a", size = 64947 }, +] + +[[package]] +name = "urllib3" +version = "2.6.3" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/c7/24/5f1b3bdffd70275f6661c76461e25f024d5a38a46f04aaca912426a2b1d3/urllib3-2.6.3.tar.gz", hash = "sha256:1b62b6884944a57dbe321509ab94fd4d3b307075e0c2eae991ac71ee15ad38ed", size = 435556 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/39/08/aaaad47bc4e9dc8c725e68f9d04865dbcb2052843ff09c97b08904852d84/urllib3-2.6.3-py3-none-any.whl", hash = 
"sha256:bf272323e553dfb2e87d9bfd225ca7b0f467b919d7bbd355436d3fd37cb0acd4", size = 131584 }, +] diff --git a/foc_project14_client.py b/foc_project14_client.py new file mode 100644 index 0000000..15fa03b --- /dev/null +++ b/foc_project14_client.py @@ -0,0 +1,522 @@ +""" +Shared GitHub GraphQL client for FilOzone organization Project 14 (FOC board). + +Used by foc_wg_pr_notifier.py and foc-pr-report. +""" + +from __future__ import annotations + +from typing import Any, Dict, List, Optional + +import requests + +FILOZ_ORG = "FilOzone" +PROJECT_NUMBER = 14 +GRAPHQL_URL = "https://api.github.com/graphql" + +PROJECT_QUERY = """ +query($org: String!, $number: Int!) { + organization(login: $org) { + projectV2(number: $number) { + id + title + } + } +} +""" + +ITEMS_QUERY = """ +query($projectId: ID!, $cursor: String) { + node(id: $projectId) { + ... on ProjectV2 { + items(first: 100, after: $cursor) { + pageInfo { + hasNextPage + endCursor + } + nodes { + id + fieldValues(first: 20) { + nodes { + ... on ProjectV2ItemFieldTextValue { + text + field { ... on ProjectV2Field { name } } + } + ... on ProjectV2ItemFieldSingleSelectValue { + name + field { ... on ProjectV2SingleSelectField { name } } + } + ... on ProjectV2ItemFieldIterationValue { + title + field { ... on ProjectV2IterationField { name } } + } + } + } + content { + ... on PullRequest { + __typename + number + title + url + state + isDraft + createdAt + updatedAt + author { login } + assignees(first: 10) { + nodes { login } + } + reviewRequests(first: 10) { + nodes { + requestedReviewer { + ... on User { login } + ... on Team { name } + } + } + } + latestReviews(first: 10) { + nodes { + author { login } + state + } + } + repository { nameWithOwner } + milestone { title } + } + ... 
on Issue { + __typename + number + title + url + state + } + } + } + } + } + } +} +""" + + +def graphql_query( + session: requests.Session, + query: str, + variables: Optional[Dict[str, Any]] = None, +) -> Dict[str, Any]: + """Execute a GraphQL query against GitHub API.""" + payload: Dict[str, Any] = {"query": query} + if variables: + payload["variables"] = variables + + response = session.post(GRAPHQL_URL, json=payload, timeout=30) + response.raise_for_status() + + result = response.json() + if "errors" in result: + errs = result["errors"] + for e in errs: + if e.get("type") == "INSUFFICIENT_SCOPES": + msg = ( + "GitHub token is missing required OAuth/PAT scopes for Project v2 " + "(typically read:project). If you use GitHub CLI, run:\n" + " gh auth refresh -s read:project\n" + "Or create a PAT that includes the read:project scope. " + f"Original API message: {e.get('message', errs)}" + ) + raise Exception(msg) from None + raise Exception(f"GraphQL errors: {errs}") + + return result["data"] + + +def _projects_v2_rest_headers(session: requests.Session) -> Dict[str, str]: + """Headers for organization Project v2 REST endpoints.""" + h = {k: v for k, v in session.headers.items() if v is not None} + h["Accept"] = "application/vnd.github+json" + h["X-GitHub-Api-Version"] = "2022-11-28" + return h + + +def list_project_v2_field_ids_by_name( + session: requests.Session, + *, + org: str, + project_number: int, + verbose: bool = False, +) -> Dict[str, int]: + """Return custom field name -> REST numeric id (paginated).""" + url: Optional[str] = ( + f"https://api.github.com/orgs/{org}/projectsV2/{project_number}/fields" + ) + params: Optional[Dict[str, Any]] = {"per_page": 100} + by_name: Dict[str, int] = {} + page = 1 + + while url: + resp = session.get( + url, + params=params, + headers=_projects_v2_rest_headers(session), + timeout=60, + ) + resp.raise_for_status() + batch = resp.json() + if not isinstance(batch, list): + raise Exception(f"Unexpected /fields response type: 
{type(batch)}") + + for f in batch: + name = f.get("name") + fid = f.get("id") + if name is not None and fid is not None: + by_name[str(name)] = int(fid) + + if verbose: + print( + f"REST fields page {page}: +{len(batch)} " + f"(unique names {len(by_name)})", + flush=True, + ) + + next_url = resp.links.get("next", {}).get("url") + url = next_url + params = None + page += 1 + + return by_name + + +def fetch_pull_request_review_logins( + session: requests.Session, + owner: str, + repo: str, + pull_number: int, +) -> set[str]: + """ + Logins who submitted a non-pending pull request review (COMMENTED, APPROVED, CHANGES_REQUESTED, etc.). + + GitHub project search matches people here even when they are no longer in ``requested_reviewers``. + """ + logins: set[str] = set() + url: Optional[str] = ( + f"https://api.github.com/repos/{owner}/{repo}/pulls/{pull_number}/reviews" + ) + params: Optional[Dict[str, Any]] = {"per_page": 100} + + while url: + resp = session.get( + url, + params=params, + headers=_projects_v2_rest_headers(session), + timeout=60, + ) + resp.raise_for_status() + batch = resp.json() + if not isinstance(batch, list): + break + + for rev in batch: + if rev.get("state") == "PENDING": + continue + user = rev.get("user") or {} + if user.get("type") != "User": + continue + login = user.get("login") + if isinstance(login, str): + logins.add(login) + + next_url = resp.links.get("next", {}).get("url") + url = next_url + params = None + + return logins + + +def enrich_pull_items_with_submitted_reviewers( + session: requests.Session, + items: List[Dict[str, Any]], + *, + verbose: bool = True, +) -> None: + """ + Set ``content['_submitted_reviewer_logins']`` on each pull request item (sorted list of logins). + + Excludes the PR author. Complements ``requested_reviewers`` on the project card: the board UI + often lists people under **Reviewers** after they submit a formal PR review (including + COMMENTED) even if they are not in ``requested_reviewers`` anymore. 
See foc-pr-report README. + """ + pr_items = [ + it + for it in items + if (it.get("content") or {}).get("__typename") == "PullRequest" + ] + total = len(pr_items) + if verbose and total: + print( + f"Fetching submitted reviews for {total} pull request(s)...", + flush=True, + ) + + for i, item in enumerate(pr_items, start=1): + content = item["content"] + repo_full = (content.get("repository") or {}).get("nameWithOwner") + num = content.get("number") + if not repo_full or num is None or "/" not in repo_full: + content["_submitted_reviewer_logins"] = [] + continue + + owner, repo = repo_full.split("/", 1) + logins = fetch_pull_request_review_logins(session, owner, repo, int(num)) + author = (content.get("author") or {}).get("login") + if isinstance(author, str): + logins.discard(author) + + content["_submitted_reviewer_logins"] = sorted(logins) + + if verbose and i % 10 == 0: + print(f" reviews {i}/{total}...", flush=True) + + +def fetch_project_v2_items_rest( + session: requests.Session, + *, + org: str, + project_number: int, + query: str, + field_ids: Optional[List[int]] = None, + per_page: int = 100, + verbose: bool = True, +) -> List[Dict[str, Any]]: + """ + List organization Project v2 items via REST with server-side q filter. + + ``query`` uses the same project filter syntax as the board (e.g. is:pr, -status:...). 
+ See https://docs.github.com/en/rest/projects/items#list-items-for-an-organization-owned-project + """ + url: Optional[str] = ( + f"https://api.github.com/orgs/{org}/projectsV2/{project_number}/items" + ) + params: Optional[Dict[str, Any]] = { + "per_page": per_page, + "q": query, + } + if field_ids: + params["fields"] = ",".join(str(i) for i in field_ids) + + all_rows: List[Dict[str, Any]] = [] + page = 1 + + while url: + if verbose: + print(f"REST items page {page}...", end="", flush=True) + + resp = session.get( + url, + params=params, + headers=_projects_v2_rest_headers(session), + timeout=120, + ) + resp.raise_for_status() + batch = resp.json() + if not isinstance(batch, list): + raise Exception(f"Unexpected /items response type: {type(batch)}") + + all_rows.extend(batch) + + if verbose: + print(f" got {len(batch)} (total {len(all_rows)})", flush=True) + + next_url = resp.links.get("next", {}).get("url") + url = next_url + params = None + page += 1 + + if verbose: + print(f"Total REST items matching q: {len(all_rows)}", flush=True) + return all_rows + + +def rest_board_item_to_graphql_node(item: Dict[str, Any]) -> Dict[str, Any]: + """Map a REST list-item JSON object to GraphQL ``items.nodes`` shape.""" + field_nodes: List[Dict[str, Any]] = [] + for f in item.get("fields") or []: + if f.get("name") != "Status": + continue + val = f.get("value") + if not isinstance(val, dict): + continue + name_obj = val.get("name") + raw: Optional[str] + if isinstance(name_obj, dict): + raw = name_obj.get("raw") + elif isinstance(name_obj, str): + raw = name_obj + else: + raw = None + if raw: + field_nodes.append({"name": raw, "field": {"name": "Status"}}) + + out: Dict[str, Any] = { + "id": item.get("node_id"), + "fieldValues": {"nodes": field_nodes}, + "content": None, + } + + content = item.get("content") + if not isinstance(content, dict): + return out + + api_url = content.get("url") or "" + if "/pulls/" not in api_url: + return out + + assignees = [ + {"login": 
a["login"]} + for a in (content.get("assignees") or []) + if isinstance(a, dict) and a.get("login") + ] + review_nodes: List[Dict[str, Any]] = [] + for r in content.get("requested_reviewers") or []: + if isinstance(r, dict) and r.get("login"): + review_nodes.append({"requestedReviewer": {"login": r["login"]}}) + + repo_full: Optional[str] = None + if "/repos/" in api_url: + try: + repo_full = api_url.split("/repos/", 1)[1].split("/pulls/", 1)[0] + except (IndexError, ValueError): + repo_full = None + + ms = content.get("milestone") + milestone: Optional[Dict[str, str]] = None + if isinstance(ms, dict) and ms.get("title"): + milestone = {"title": ms["title"]} + + st = content.get("state") + state_gql = st.upper() if isinstance(st, str) else "UNKNOWN" + + author_login: Optional[str] = None + u = content.get("user") + if isinstance(u, dict): + author_login = u.get("login") + + out["content"] = { + "__typename": "PullRequest", + "number": content.get("number"), + "title": content.get("title"), + "url": content.get("html_url"), + "state": state_gql, + "isDraft": bool(content.get("draft")), + "author": {"login": author_login} if author_login else {}, + "assignees": {"nodes": assignees}, + "reviewRequests": {"nodes": review_nodes}, + "repository": {"nameWithOwner": repo_full} if repo_full else {}, + "milestone": milestone, + } + return out + + +def fetch_project_board_items_rest_filtered( + session: requests.Session, + *, + org: str = FILOZ_ORG, + project_number: int = PROJECT_NUMBER, + filter_query: str, + verbose: bool = True, +) -> List[Dict[str, Any]]: + """ + List project items with REST ``q`` (server-side filter), expand Status, normalize to GraphQL nodes. + + Use this when the filter can be expressed in project search syntax so GitHub returns + fewer rows than listing the entire board. 
+ """ + fields_map = list_project_v2_field_ids_by_name( + session, + org=org, + project_number=project_number, + verbose=verbose, + ) + status_id = fields_map.get("Status") + if status_id is None: + raise Exception( + f'No field named "Status" on org {org!r} project {project_number}', + ) + + raw = fetch_project_v2_items_rest( + session, + org=org, + project_number=project_number, + query=filter_query, + field_ids=[status_id], + verbose=verbose, + ) + return [rest_board_item_to_graphql_node(row) for row in raw] + + +def field_values_by_name(item: Dict[str, Any]) -> Dict[str, str]: + """Map project field name -> value for a project item node.""" + out: Dict[str, str] = {} + for fv in item.get("fieldValues", {}).get("nodes", []): + if not fv: + continue + field = fv.get("field", {}) + field_name = field.get("name") if field else None + if field_name: + value = fv.get("name") or fv.get("text") or fv.get("title") + if value is not None: + out[field_name] = value + return out + + +def fetch_all_project_items( + session: requests.Session, + *, + org: str = FILOZ_ORG, + project_number: int = PROJECT_NUMBER, + verbose: bool = True, +) -> List[Dict[str, Any]]: + """Fetch all Project V2 items with pagination. 
Returns raw `items.nodes` list.""" + project_data = graphql_query( + session, + PROJECT_QUERY, + {"org": org, "number": project_number}, + ) + + project = project_data["organization"]["projectV2"] + if not project: + raise Exception(f"Project {project_number} not found in {org}") + + project_id = project["id"] + if verbose: + print(f"Found project: {project['title']} (ID: {project_id})") + + all_items: List[Dict[str, Any]] = [] + cursor = None + page = 1 + + while True: + if verbose: + print(f"Fetching page {page}...", end="", flush=True) + + data = graphql_query( + session, + ITEMS_QUERY, + {"projectId": project_id, "cursor": cursor}, + ) + + items_data = data["node"]["items"] + nodes = items_data["nodes"] + all_items.extend(nodes) + + if verbose: + print(f" got {len(nodes)} items") + + if not items_data["pageInfo"]["hasNextPage"]: + break + + cursor = items_data["pageInfo"]["endCursor"] + page += 1 + + if verbose: + print(f"Total items fetched: {len(all_items)}") + return all_items diff --git a/foc_wg_pr_notifier.py b/foc_wg_pr_notifier.py index 9bfbebd..0aa0527 100644 --- a/foc_wg_pr_notifier.py +++ b/foc_wg_pr_notifier.py @@ -21,9 +21,7 @@ from urllib.parse import quote import argparse -# FilOzone Project 14 ID (from the project URL) -FILOZ_ORG = "FilOzone" -PROJECT_NUMBER = 14 +from foc_project14_client import fetch_all_project_items, field_values_by_name # Milestones to exclude (view 32 filter) EXCLUDED_MILESTONES = [ @@ -43,8 +41,6 @@ class FOCWGNotifier: """Fetches PRs from FilOzone Project 14 and posts to Slack.""" - GRAPHQL_URL = "https://api.github.com/graphql" - def __init__(self, github_token: str, slack_webhook_url: Optional[str] = None, user_map_path: Optional[str] = None): self.github_token = github_token @@ -75,147 +71,9 @@ def _load_user_map(self, user_map_path: Optional[str] = None) -> Dict[str, str]: print(f"Error parsing user map file: {e} (using empty mapping)") return {} - def _graphql_query(self, query: str, variables: Dict[str, Any] = None) 
-> Dict[str, Any]: - """Execute a GraphQL query against GitHub API.""" - payload = {'query': query} - if variables: - payload['variables'] = variables - - response = self.session.post(self.GRAPHQL_URL, json=payload, timeout=30) - response.raise_for_status() - - result = response.json() - if 'errors' in result: - raise Exception(f"GraphQL errors: {result['errors']}") - - return result['data'] - def fetch_project_items(self) -> List[Dict[str, Any]]: """Fetch all items from FilOzone Project 14 with pagination.""" - # First, get the project ID - project_query = """ - query($org: String!, $number: Int!) { - organization(login: $org) { - projectV2(number: $number) { - id - title - } - } - } - """ - - project_data = self._graphql_query(project_query, { - 'org': FILOZ_ORG, - 'number': PROJECT_NUMBER, - }) - - project = project_data['organization']['projectV2'] - if not project: - raise Exception(f"Project {PROJECT_NUMBER} not found in {FILOZ_ORG}") - - project_id = project['id'] - print(f"Found project: {project['title']} (ID: {project_id})") - - # Now fetch all items with pagination - items_query = """ - query($projectId: ID!, $cursor: String) { - node(id: $projectId) { - ... on ProjectV2 { - items(first: 100, after: $cursor) { - pageInfo { - hasNextPage - endCursor - } - nodes { - id - fieldValues(first: 20) { - nodes { - ... on ProjectV2ItemFieldTextValue { - text - field { ... on ProjectV2Field { name } } - } - ... on ProjectV2ItemFieldSingleSelectValue { - name - field { ... on ProjectV2SingleSelectField { name } } - } - ... on ProjectV2ItemFieldIterationValue { - title - field { ... on ProjectV2IterationField { name } } - } - } - } - content { - ... on PullRequest { - __typename - number - title - url - state - isDraft - createdAt - updatedAt - author { login } - assignees(first: 10) { - nodes { login } - } - reviewRequests(first: 10) { - nodes { - requestedReviewer { - ... on User { login } - ... 
on Team { name } - } - } - } - latestReviews(first: 10) { - nodes { - author { login } - state - } - } - repository { nameWithOwner } - milestone { title } - } - ... on Issue { - __typename - number - title - url - state - } - } - } - } - } - } - } - """ - - all_items = [] - cursor = None - page = 1 - - while True: - print(f"Fetching page {page}...", end="", flush=True) - - data = self._graphql_query(items_query, { - 'projectId': project_id, - 'cursor': cursor, - }) - - items_data = data['node']['items'] - nodes = items_data['nodes'] - all_items.extend(nodes) - - print(f" got {len(nodes)} items") - - if not items_data['pageInfo']['hasNextPage']: - break - - cursor = items_data['pageInfo']['endCursor'] - page += 1 - - print(f"Total items fetched: {len(all_items)}") - return all_items + return fetch_all_project_items(self.session, verbose=True) def filter_items(self, items: List[Dict[str, Any]]) -> List[Dict[str, Any]]: """Apply filters to the items (open PRs, non-draft, excluding certain milestones and Done status).""" @@ -238,17 +96,7 @@ def filter_items(self, items: List[Dict[str, Any]]) -> List[Dict[str, Any]]: if content.get('isDraft'): continue - # Get project field values - field_values = {} - for fv in item.get('fieldValues', {}).get('nodes', []): - if not fv: - continue - field = fv.get('field', {}) - field_name = field.get('name') if field else None - if field_name: - # Get the value (could be 'name', 'text', or 'title' depending on field type) - value = fv.get('name') or fv.get('text') or fv.get('title') - field_values[field_name] = value + field_values = field_values_by_name(item) # Filter: Exclude status "🎉 Done" status = field_values.get('Status') From 5d6087b0b9088f3856fd7a8fb46688e3c291b34e Mon Sep 17 00:00:00 2001 From: Steve Loeppky Date: Tue, 31 Mar 2026 20:49:28 -0700 Subject: [PATCH 2/3] feat(foc-pr-report): empty workload row and repo matrix totals - Add synthetic empty row (no assignee / no reviewer) with board search links - Add Total row and 
column to PR count by repository and status - Rename first workload column to who; document output in README - Wire CLI through render_full_markdown for both tables Made-with: Cursor --- foc-pr-report/README.md | 7 +- foc-pr-report/foc_pr_report/cli.py | 8 +- foc-pr-report/foc_pr_report/report.py | 168 ++++++++++++++++++++++++-- 3 files changed, 169 insertions(+), 14 deletions(-) diff --git a/foc-pr-report/README.md b/foc-pr-report/README.md index 2e12624..e94abbf 100644 --- a/foc-pr-report/README.md +++ b/foc-pr-report/README.md @@ -66,6 +66,11 @@ Those two can diverge. For example, you might **submit a review** (so the UI can **What this tool does:** For each PR in the report, the **reviewer** counts use the **union** of (1) `requested_reviewers` on the card and (2) human users who have submitted at least one non-`PENDING` pull request review (same review list as above), **excluding the PR author**. That is meant to stay close to how filtering by person on the board behaves and what you see in the **Reviewers** column—not “everyone who left an issue comment.” +## Output tables + +1. **Workload by person** — rows are GitHub users × board status; assignee and reviewer counts match the semantics above. Rows labeled **empty** (after all named users) count PRs with **no assignee** in the **assignee** column and PRs with **no reviewer** in the **reviewer** column (same reviewer union as above—only user logins; team-only review requests still show as empty here). The **assignee** cell links add `no:assignee`; the **reviewer** cell links add GitHub’s `review:none` (no submitted pull request reviews), which can differ slightly from this row’s count when a user is requested but has not submitted a review yet. The **empty** label itself links to the base filter only. +2. **PR count by repository and status** — rows are repositories (`owner/repo`), columns are statuses present in the filtered set, plus a **Total** column (row sums) and **Total** row (column sums). 
Repository names link to the base filter plus `repo:owner/name`. Status column headers link to the base filter plus `status:"…"` only. Each non-zero cell links to that repo and status combined. The **Total** column header and the bottom-right grand total link to the base filter only; row totals link like the repo row; the total row’s status cells link like the status column headers. + ## Links -Each column links to [View 2](https://github.com/orgs/FilOzone/projects/14/views/2) with a `filterQuery` matching that cell (base filter plus user, status, assignee, or reviewers as appropriate). +[View 2](https://github.com/orgs/FilOzone/projects/14/views/2) `filterQuery` values use the same base filter as the tool (`is:pr` excluding Done/Todo), plus qualifiers per cell as described above. diff --git a/foc-pr-report/foc_pr_report/cli.py b/foc-pr-report/foc_pr_report/cli.py index 529b5ed..1b35322 100644 --- a/foc-pr-report/foc_pr_report/cli.py +++ b/foc-pr-report/foc_pr_report/cli.py @@ -20,7 +20,11 @@ fetch_project_board_items_rest_filtered, ) -from foc_pr_report.report import BASE_FILTER, aggregate_rows, render_markdown # noqa: E402 +from foc_pr_report.report import ( # noqa: E402 + BASE_FILTER, + aggregate_rows, + render_full_markdown, +) def main() -> None: @@ -67,7 +71,7 @@ def main() -> None: items, verbose=not args.quiet, ) - md = render_markdown(aggregate_rows(items)) + md = render_full_markdown(items, aggregate_rows(items)) if args.output: Path(args.output).write_text(md, encoding="utf-8") diff --git a/foc-pr-report/foc_pr_report/report.py b/foc-pr-report/foc_pr_report/report.py index e494729..002a5f7 100644 --- a/foc-pr-report/foc_pr_report/report.py +++ b/foc-pr-report/foc_pr_report/report.py @@ -23,6 +23,9 @@ "✔️ Approved by reviewer", ] +# Synthetic row: PRs with no assignee (assignee column) or no reviewer (reviewer column). 
+EMPTY_ROW_LOGIN = "empty" + Row = Tuple[str, str, int, int] @@ -41,6 +44,13 @@ def _status_sort_key(status: str) -> Tuple[int, str]: return (1, status.lower()) +def _person_row_sort_key(t: Row) -> Tuple[int, str, Tuple[int, str], str]: + """Real users first (A–z), then the synthetic ``empty`` row, then by status.""" + login, status, _, _ = t + group = 1 if login == EMPTY_ROW_LOGIN else 0 + return (group, login.lower(), _status_sort_key(status), status) + + def aggregate_rows(items: List[Dict[str, Any]]) -> List[Row]: """Return (login, status, assignee_count, reviewer_count) for each user/status lane.""" assignee_counts: Dict[Tuple[str, str], int] = defaultdict(int) @@ -74,10 +84,17 @@ def aggregate_rows(items: List[Dict[str, Any]]) -> List[Row]: for login in content.get("_submitted_reviewer_logins") or []: reviewer_logins.add(login) - for login in assignee_logins: - assignee_counts[(login, status)] += 1 - for login in reviewer_logins: - reviewer_counts[(login, status)] += 1 + if not assignee_logins: + assignee_counts[(EMPTY_ROW_LOGIN, status)] += 1 + else: + for login in assignee_logins: + assignee_counts[(login, status)] += 1 + + if not reviewer_logins: + reviewer_counts[(EMPTY_ROW_LOGIN, status)] += 1 + else: + for login in reviewer_logins: + reviewer_counts[(login, status)] += 1 keys = set(assignee_counts) | set(reviewer_counts) rows: List[Row] = [] @@ -88,29 +105,158 @@ def aggregate_rows(items: List[Dict[str, Any]]) -> List[Row]: continue rows.append((login, status, a, r)) - rows.sort(key=lambda t: (t[0].lower(), _status_sort_key(t[1]), t[1])) + rows.sort(key=_person_row_sort_key) return rows def render_markdown(rows: List[Row]) -> str: """GFM table with linked username, state label, assignee count, reviewer count.""" lines = [ - "| github username | state | assignee | reviewer |", + "| who | state | assignee | reviewer |", "| --- | --- | --- | --- |", ] for login, status, a_count, r_count in rows: - user_q = _filter_body(BASE_FILTER, login) - state_q = 
_filter_body(BASE_FILTER, f'status:"{status}"', login) - assign_q = _filter_body(BASE_FILTER, f'status:"{status}"', f"assignee:{login}") - rev_q = _filter_body(BASE_FILTER, f'status:"{status}"', f"reviewers:{login}") + if login == EMPTY_ROW_LOGIN: + user_q = BASE_FILTER + state_q = _filter_body(BASE_FILTER, f'status:"{status}"') + assign_q = _filter_body(BASE_FILTER, f'status:"{status}"', "no:assignee") + rev_q = _filter_body(BASE_FILTER, f'status:"{status}"', "review:none") + label = "empty" + else: + user_q = _filter_body(BASE_FILTER, login) + state_q = _filter_body(BASE_FILTER, f'status:"{status}"', login) + assign_q = _filter_body(BASE_FILTER, f'status:"{status}"', f"assignee:{login}") + rev_q = _filter_body(BASE_FILTER, f'status:"{status}"', f"reviewers:{login}") + label = login lines.append( "| " - + f"[{login}]({view2_url(user_q)}) | " + + f"[{label}]({view2_url(user_q)}) | " + f"[{status}]({view2_url(state_q)}) | " + f"[{a_count}]({view2_url(assign_q)}) | " + f"[{r_count}]({view2_url(rev_q)}) |" ) return "\n".join(lines) + "\n" + + +def aggregate_repo_status_matrix( + items: List[Dict[str, Any]], +) -> Tuple[List[str], List[str], Dict[Tuple[str, str], int]]: + """ + Count PRs per (repository, Status) for items matching the same rules as aggregate_rows + (pull requests only; Status not Done/Todo). 
+ """ + counts: Dict[Tuple[str, str], int] = defaultdict(int) + + for item in items: + content = item.get("content") + if not content or content.get("__typename") != "PullRequest": + continue + + field_values = field_values_by_name(item) + status = field_values.get("Status") + if not status or status in (STATUS_DONE, STATUS_TODO): + continue + + repo = (content.get("repository") or {}).get("nameWithOwner") + if not isinstance(repo, str) or not repo.strip(): + repo = "unknown" + + counts[(repo, status)] += 1 + + repos = sorted({r for (r, _) in counts}, key=str.lower) + statuses = sorted({s for (_, s) in counts}, key=lambda s: (_status_sort_key(s), s)) + return repos, statuses, dict(counts) + + +def render_repo_status_markdown( + items: List[Dict[str, Any]], +) -> str: + """ + Markdown table: rows = repository, columns = status, cells = PR count (linked to View 2). + + Row header links: BASE_FILTER + repo:owner/name. + Column header links: BASE_FILTER + status:\"…\" (no repo). + Cell links: BASE_FILTER + repo + status (intersection). + Total column: per-repo sum, linked like the row repo filter; header links BASE_FILTER. + Total row: per-status sum, linked like status headers; grand total links BASE_FILTER. 
+ """ + repos, statuses, counts = aggregate_repo_status_matrix(items) + + if not repos or not statuses: + return "_No pull requests to show in this matrix._\n" + + lines: List[str] = [] + + status_headers = [ + f"[{status}]({view2_url(_filter_body(BASE_FILTER, f'status:\"{status}\"'))})" + for status in statuses + ] + total_col_header = f"[Total]({view2_url(BASE_FILTER)})" + lines.append("| Repository | " + " | ".join(status_headers) + f" | {total_col_header} |") + + col_count = len(statuses) + 2 # Repository + statuses + Total + sep = ["---"] * col_count + lines.append("| " + " | ".join(sep) + " |") + + column_totals = [sum(counts.get((r, s), 0) for r in repos) for s in statuses] + grand_total = sum(column_totals) + + for repo in repos: + if repo == "unknown": + first = "unknown" + row_q_repo = None + else: + row_q_repo = _filter_body(BASE_FILTER, f"repo:{repo}") + first = f"[{repo}]({view2_url(row_q_repo)})" + + row_out = [first] + row_sum = 0 + for status in statuses: + n = counts.get((repo, status), 0) + row_sum += n + if n == 0: + row_out.append("0") + else: + cell_q = _filter_body(BASE_FILTER, f"repo:{repo}", f'status:"{status}"') + row_out.append(f"[{n}]({view2_url(cell_q)})") + + if row_sum == 0: + row_out.append("0") + elif row_q_repo is not None: + row_out.append(f"[{row_sum}]({view2_url(row_q_repo)})") + else: + row_out.append(str(row_sum)) + + lines.append("| " + " | ".join(row_out) + " |") + + total_first = f"[Total]({view2_url(BASE_FILTER)})" + total_row_cells: List[str] = [total_first] + for status, col_sum in zip(statuses, column_totals, strict=True): + if col_sum == 0: + total_row_cells.append("0") + else: + col_q = _filter_body(BASE_FILTER, f'status:"{status}"') + total_row_cells.append(f"[{col_sum}]({view2_url(col_q)})") + if grand_total == 0: + total_row_cells.append("0") + else: + total_row_cells.append(f"[{grand_total}]({view2_url(BASE_FILTER)})") + lines.append("| " + " | ".join(total_row_cells) + " |") + + return "\n".join(lines) + "\n" + + 
+def render_full_markdown( + items: List[Dict[str, Any]], + person_rows: List[Row], +) -> str: + """Person-centric table then repository × status matrix.""" + parts = [render_markdown(person_rows).rstrip(), ""] + parts.append("## PR count by repository and status") + parts.append("") + parts.append(render_repo_status_markdown(items).rstrip()) + parts.append("") + return "\n".join(parts) From aa5d76db2463a686e8e969521f545261cd73bdb8 Mon Sep 17 00:00:00 2001 From: Steve Loeppky Date: Tue, 31 Mar 2026 20:55:42 -0700 Subject: [PATCH 3/3] docs(foc-pr-report): add PR count by individual heading Mirror the repository table structure with a ## section title and align README wording. Made-with: Cursor --- foc-pr-report/README.md | 2 +- foc-pr-report/foc_pr_report/report.py | 7 ++++++- 2 files changed, 7 insertions(+), 2 deletions(-) diff --git a/foc-pr-report/README.md b/foc-pr-report/README.md index e94abbf..f0dfbd0 100644 --- a/foc-pr-report/README.md +++ b/foc-pr-report/README.md @@ -68,7 +68,7 @@ Those two can diverge. For example, you might **submit a review** (so the UI can ## Output tables -1. **Workload by person** — rows are GitHub users × board status; assignee and reviewer counts match the semantics above. Rows labeled **empty** (after all named users) count PRs with **no assignee** in the **assignee** column and PRs with **no reviewer** in the **reviewer** column (same reviewer union as above—only user logins; team-only review requests still show as empty here). The **assignee** cell links add `no:assignee`; the **reviewer** cell links add GitHub’s `review:none` (no submitted pull request reviews), which can differ slightly from this row’s count when a user is requested but has not submitted a review yet. The **empty** label itself links to the base filter only. +1. **PR count by individual** — rows are GitHub users × board status; assignee and reviewer counts match the semantics above. 
Rows labeled **empty** (after all named users) count PRs with **no assignee** in the **assignee** column and PRs with **no reviewer** in the **reviewer** column (same reviewer union as above—only user logins; team-only review requests still show as empty here). The **assignee** cell links add `no:assignee`; the **reviewer** cell links add GitHub’s `review:none` (no submitted pull request reviews), which can differ slightly from this row’s count when a user is requested but has not submitted a review yet. The **empty** label itself links to the base filter only. 2. **PR count by repository and status** — rows are repositories (`owner/repo`), columns are statuses present in the filtered set, plus a **Total** column (row sums) and **Total** row (column sums). Repository names link to the base filter plus `repo:owner/name`. Status column headers link to the base filter plus `status:"…"` only. Each non-zero cell links to that repo and status combined. The **Total** column header and the bottom-right grand total link to the base filter only; row totals link like the repo row; the total row’s status cells link like the status column headers. ## Links diff --git a/foc-pr-report/foc_pr_report/report.py b/foc-pr-report/foc_pr_report/report.py index 002a5f7..5a0c7a7 100644 --- a/foc-pr-report/foc_pr_report/report.py +++ b/foc-pr-report/foc_pr_report/report.py @@ -254,7 +254,12 @@ def render_full_markdown( person_rows: List[Row], ) -> str: """Person-centric table then repository × status matrix.""" - parts = [render_markdown(person_rows).rstrip(), ""] + parts = [ + "## PR count by individual", + "", + render_markdown(person_rows).rstrip(), + "", + ] parts.append("## PR count by repository and status") parts.append("") parts.append(render_repo_status_markdown(items).rstrip())