diff --git a/CHANGELOG.md b/CHANGELOG.md
index ce029b44..259f6b29 100644
--- a/CHANGELOG.md
+++ b/CHANGELOG.md
@@ -11,12 +11,19 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0
 ### Added
 - Artifactory archive entry download for virtual file packages (#525)
+- `apm info [field]` command for inspecting package metadata and remote refs (#613)
+- `apm info versions` field selector lists remote tags and branches via `git ls-remote` (#613)
+- `apm outdated` command compares locked dependencies against remote refs (#613)
+- `--parallel-checks` (`-j`) option on `apm outdated` for concurrent remote checks (default: 4) (#613)
+- Rich progress feedback during `apm outdated` dependency checking (#613)
+- `--global` flag on `apm info` for inspecting user-scope packages (#613)
 
 ### Changed
 - Scope resolution now happens once via `TargetProfile.for_scope()` and `resolve_targets()` -- integrators no longer need scope-aware parameters (#562)
 - Unified integration dispatch table in `dispatch.py` -- both install and uninstall import from one source of truth (#562)
 - Hook merge logic deduplicated: three copy-pasted JSON-merge methods replaced with `_integrate_merged_hooks()` + config dict (#562)
+- `apm outdated` uses SHA comparison for branch-pinned deps instead of reporting them as `unknown` (#613)
 
 ### Fixed
diff --git a/docs/src/content/docs/guides/dependencies.md b/docs/src/content/docs/guides/dependencies.md
index f6b3035c..4f1e2ac8 100644
--- a/docs/src/content/docs/guides/dependencies.md
+++ b/docs/src/content/docs/guides/dependencies.md
@@ -214,7 +214,7 @@ apm deps list
 apm deps tree
 
 # Get package details
-apm deps info apm-sample-package
+apm info apm-sample-package
 ```
 
 ### 4. Use Dependencies in Compilation
@@ -801,7 +801,7 @@ curl -H "Authorization: token $GITHUB_CLI_PAT" https://api.github.com/user
 
 ```bash
 # Show detailed package information
-apm deps info package-name
+apm info package-name
 
 # Show full dependency tree
 apm deps tree
@@ -856,4 +856,4 @@ apm compile
 - **[Context Guide](../../introduction/how-it-works/)** - Understanding the AI-Native Development framework
 - **[Creating Packages](../../introduction/key-concepts/)** - Build your own APM packages
 
-Ready to create your own APM packages? See the [Context Guide](../../introduction/key-concepts/) for detailed instructions on building reusable context collections and agent workflows.
\ No newline at end of file
+Ready to create your own APM packages? See the [Context Guide](../../introduction/key-concepts/) for detailed instructions on building reusable context collections and agent workflows.
diff --git a/docs/src/content/docs/reference/cli-commands.md b/docs/src/content/docs/reference/cli-commands.md
index 24dc09dc..b80a2b41 100644
--- a/docs/src/content/docs/reference/cli-commands.md
+++ b/docs/src/content/docs/reference/cli-commands.md
@@ -601,6 +601,79 @@ curl -sSL https://aka.ms/apm-unix | sh
 powershell -ExecutionPolicy Bypass -c "irm https://aka.ms/apm-windows | iex"
 ```
 
+### `apm info` - Show package metadata or remote versions
+
+Show local metadata for an installed package, or query remote refs with a field selector.
+
+```bash
+apm info PACKAGE [FIELD] [OPTIONS]
+```
+
+**Arguments:**
+- `PACKAGE` - Package name, usually `owner/repo` or a short repo name
+- `FIELD` - Optional field selector. Supported value: `versions`
+
+**Options:**
+- `-g, --global` - Inspect package from user scope (`~/.apm/`)
+
+**Examples:**
+```bash
+# Show installed package metadata
+apm info microsoft/apm-sample-package
+
+# Short-name lookup for an installed package
+apm info apm-sample-package
+
+# List remote tags and branches without cloning
+apm info microsoft/apm-sample-package versions
+
+# Inspect a package from user scope
+apm info microsoft/apm-sample-package -g
+```
+
+**Behavior:**
+- Without `FIELD`, reads installed package metadata from `apm_modules/`
+- Shows package name, version, description, source, install path, context files, workflows, and hooks
+- `versions` lists remote tags and branches without cloning the repository
+- `versions` does not require the package to be installed locally
+
+### `apm outdated` - Check locked dependencies for updates
+
+Compare locked dependencies against remote refs to detect staleness.
+
+```bash
+apm outdated [OPTIONS]
+```
+
+**Options:**
+- `-g, --global` - Check user-scope dependencies from `~/.apm/`
+- `-v, --verbose` - Show extra detail for outdated packages, including available tags
+- `-j, --parallel-checks N` - Max concurrent remote checks (default: 4, 0 = sequential)
+
+**Examples:**
+```bash
+# Check project dependencies
+apm outdated
+
+# Check user-scope dependencies
+apm outdated --global
+
+# Show available tags for outdated packages
+apm outdated --verbose
+
+# Use 8 parallel checks for large dependency sets
+apm outdated -j 8
+```
+
+**Behavior:**
+- Reads the current lockfile (`apm.lock.yaml`; legacy `apm.lock` is migrated automatically)
+- For tag-pinned deps: compares the locked semver tag against the latest available remote tag
+- For branch-pinned deps: compares the locked commit SHA against the remote branch tip SHA
+- For deps with no ref: compares against the default branch (main/master) tip SHA
+- Displays `Package`, `Current`, `Latest`, and `Status` columns
+- Status values are `up-to-date`, `outdated`, and `unknown`
+- Local dependencies and Artifactory dependencies are skipped
+
 ### `apm deps` - Manage APM package dependencies
 
 Manage APM package dependencies with installation status, tree visualization, and package information.
@@ -680,32 +753,27 @@ company-website (local)
 - Version numbers from dependency package metadata
 - Version information for each dependency
 
-#### `apm deps info` - Show detailed package information
+#### `apm deps info` - Alias for `apm info`
 
-Display comprehensive information about a specific installed package.
+Backward-compatible alias for `apm info PACKAGE_NAME`.
 
 ```bash
 apm deps info PACKAGE_NAME
 ```
 
 **Arguments:**
-- `PACKAGE_NAME` - Name of the package to show information about
+- `PACKAGE_NAME` - Installed package name to inspect
 
 **Examples:**
 ```bash
-# Show details for compliance rules package
+# Show installed package metadata
 apm deps info compliance-rules
-
-# Show info for design guidelines package
-apm deps info design-guidelines
 ```
 
-**Output includes:**
-- Complete package metadata (name, version, description, author)
-- Source repository and installation details
-- Detailed context file counts by type
-- Agent workflow descriptions and counts
-- Installation path and status
+**Notes:**
+- Produces the same local metadata output as `apm info PACKAGE_NAME`
+- Use `apm info` in new docs and scripts
+- For remote refs, use `apm info PACKAGE_NAME versions`
 
 #### `apm deps clean` - Remove all APM dependencies
diff --git a/packages/apm-guide/.apm/skills/apm-usage/commands.md b/packages/apm-guide/.apm/skills/apm-usage/commands.md
index c5a271d6..43f0e848 100644
--- a/packages/apm-guide/.apm/skills/apm-usage/commands.md
+++ b/packages/apm-guide/.apm/skills/apm-usage/commands.md
@@ -15,7 +15,9 @@
 | `apm prune` | Remove orphaned packages | `--dry-run` |
 | `apm deps list` | List installed packages | `-g` global, `--all` both scopes |
 | `apm deps tree` | Show dependency tree | -- |
-| `apm deps info PKG` | Show package details | -- |
+| `apm info PKG [FIELD]` | Show package details or remote refs | `-g` global, `FIELD=versions` |
+| `apm outdated` | Check locked deps via SHA/semver comparison | `-g` global, `-v` verbose, `-j N` parallel checks |
+| `apm deps info PKG` | Alias for `apm info PKG` local metadata | -- |
 | `apm deps clean` | Clean dependency cache | `--dry-run`, `-y` skip confirm |
 | `apm deps update [PKGS...]` | Update specific packages | `--verbose`, `--force`, `--target`, `--parallel-downloads N` |
diff --git a/src/apm_cli/cli.py b/src/apm_cli/cli.py
index d276439b..a6987e58 100644
--- a/src/apm_cli/cli.py
+++ b/src/apm_cli/cli.py
@@ -20,11 +20,13 @@
 from apm_cli.commands.compile import compile as compile_cmd
 from apm_cli.commands.config import config
 from apm_cli.commands.deps import deps
+from apm_cli.commands.info import info as info_cmd
 from apm_cli.commands.init import init
 from apm_cli.commands.install import install
 from apm_cli.commands.list_cmd import list as list_cmd
 from apm_cli.commands.marketplace import marketplace, search as marketplace_search
 from apm_cli.commands.mcp import mcp
+from apm_cli.commands.outdated import outdated as outdated_cmd
 from apm_cli.commands.pack import pack_cmd, unpack_cmd
 from apm_cli.commands.prune import prune
 from apm_cli.commands.run import preview, run
@@ -57,6 +59,7 @@ def cli(ctx):
 # Register command groups
 cli.add_command(audit)
 cli.add_command(deps)
+cli.add_command(info_cmd, name="info")
 cli.add_command(pack_cmd, name="pack")
 cli.add_command(unpack_cmd, name="unpack")
 cli.add_command(init)
@@ -71,6 +74,7 @@ def cli(ctx):
 cli.add_command(config)
 cli.add_command(runtime)
 cli.add_command(mcp)
+cli.add_command(outdated_cmd, name="outdated")
 cli.add_command(marketplace)
 cli.add_command(marketplace_search, name="search")
 
diff --git a/src/apm_cli/commands/deps/cli.py b/src/apm_cli/commands/deps/cli.py
index 29ae547e..75cc11fa 100644
--- a/src/apm_cli/commands/deps/cli.py
+++ b/src/apm_cli/commands/deps/cli.py
@@ -662,6 +662,8 @@ def update(packages, verbose, force, target, parallel_downloads, global_):
 @click.argument('package', required=True)
 def info(package: str):
     """Show detailed information about a specific package including context files and workflows."""
+    from ..info import resolve_package_path, display_package_info
+
     logger = CommandLogger("deps-info")
 
     project_root = Path(".")
@@ -672,113 +674,7 @@ def info(package: str):
         logger.progress("Run 'apm install' to install dependencies first")
         sys.exit(1)
 
-    # Find the package directory - handle org/repo and deep sub-path structures
-    package_path = None
-    # First try direct path match (handles any depth: org/repo, org/repo/subdir/pkg)
-    direct_match = apm_modules_path / package
-    if direct_match.is_dir() and (
-        (direct_match / APM_YML_FILENAME).exists() or (direct_match / SKILL_MD_FILENAME).exists()
-    ):
-        package_path = direct_match
-    else:
-        # Fallback: scan org/repo structure (2-level) for short package names
-        for org_dir in apm_modules_path.iterdir():
-            if org_dir.is_dir() and not org_dir.name.startswith('.'):
-                for package_dir in org_dir.iterdir():
-                    if package_dir.is_dir() and not package_dir.name.startswith('.'):
-                        if package_dir.name == package or f"{org_dir.name}/{package_dir.name}" == package:
-                            package_path = package_dir
-                            break
-                if package_path:
-                    break
-
-    if not package_path:
-        logger.error(f"Package '{package}' not found in apm_modules/")
-        logger.progress("Available packages:")
-
-        for org_dir in apm_modules_path.iterdir():
-            if org_dir.is_dir() and not org_dir.name.startswith('.'):
-                for package_dir in org_dir.iterdir():
-                    if package_dir.is_dir() and not package_dir.name.startswith('.'):
-                        click.echo(f"  - {org_dir.name}/{package_dir.name}")
-        sys.exit(1)
-
-    try:
-        # Load package information
-        package_info = _get_detailed_package_info(package_path)
-
-        # Display with Rich panel if available
-        try:
-            from rich.panel import Panel
-            from rich.console import Console
-            from rich.text import Text
-            console = Console()
-
-            content_lines = []
-            content_lines.append(f"[bold]Name:[/bold] {package_info['name']}")
-            content_lines.append(f"[bold]Version:[/bold] {package_info['version']}")
-            content_lines.append(f"[bold]Description:[/bold] {package_info['description']}")
-            content_lines.append(f"[bold]Author:[/bold] {package_info['author']}")
-            content_lines.append(f"[bold]Source:[/bold] {package_info['source']}")
-            content_lines.append(f"[bold]Install Path:[/bold] {package_info['install_path']}")
-            content_lines.append("")
-            content_lines.append("[bold]Context Files:[/bold]")
-
-            for context_type, count in package_info['context_files'].items():
-                if count > 0:
-                    content_lines.append(f"  * {count} {context_type}")
-
-            if not any(count > 0 for count in package_info['context_files'].values()):
-                content_lines.append("  * No context files found")
-
-            content_lines.append("")
-            content_lines.append("[bold]Agent Workflows:[/bold]")
-            if package_info['workflows'] > 0:
-                content_lines.append(f"  * {package_info['workflows']} executable workflows")
-            else:
-                content_lines.append("  * No agent workflows found")
-
-            if package_info.get('hooks', 0) > 0:
-                content_lines.append("")
-                content_lines.append("[bold]Hooks:[/bold]")
-                content_lines.append(f"  * {package_info['hooks']} hook file(s)")
-
-            content = "\n".join(content_lines)
-            panel = Panel(content, title=f"[i] Package Info: {package}", border_style="cyan")
-            console.print(panel)
-
-        except ImportError:
-            # Fallback text display
-            click.echo(f"[i] Package Info: {package}")
-            click.echo("=" * 40)
-            click.echo(f"Name: {package_info['name']}")
-            click.echo(f"Version: {package_info['version']}")
-            click.echo(f"Description: {package_info['description']}")
-            click.echo(f"Author: {package_info['author']}")
-            click.echo(f"Source: {package_info['source']}")
-            click.echo(f"Install Path: {package_info['install_path']}")
-            click.echo("")
-            click.echo("Context Files:")
-
-            for context_type, count in package_info['context_files'].items():
-                if count > 0:
-                    click.echo(f"  * {count} {context_type}")
-
-            if not any(count > 0 for count in package_info['context_files'].values()):
-                click.echo("  * No context files found")
-
-            click.echo("")
-            click.echo("Agent Workflows:")
-            if package_info['workflows'] > 0:
-                click.echo(f"  * {package_info['workflows']} executable workflows")
-            else:
-                click.echo("  * No agent workflows found")
-
-            if package_info.get('hooks', 0) > 0:
-                click.echo("")
-                click.echo("Hooks:")
-                click.echo(f"  * {package_info['hooks']} hook file(s)")
-
-    except Exception as e:
-        logger.error(f"Error reading package information: {e}")
-        sys.exit(1)
+    package_path = resolve_package_path(package, apm_modules_path, logger)
+    if package_path is None:
+        # Path validation failed (resolve_package_path already logged the error)
+        sys.exit(1)
+    display_package_info(package, package_path, logger)
diff --git a/src/apm_cli/commands/info.py b/src/apm_cli/commands/info.py
new file mode 100644
index 00000000..dab3b2ad
--- /dev/null
+++ b/src/apm_cli/commands/info.py
@@ -0,0 +1,367 @@
+"""Top-level ``apm info`` command.
+
+Shows detailed metadata for an installed package. Also exposes helpers
+reused by the backward-compatible ``apm deps info`` alias.
+"""
+
+import sys
+from pathlib import Path
+from typing import Any, Dict, List, Optional
+
+import click
+
+from ..constants import APM_MODULES_DIR, APM_YML_FILENAME, SKILL_MD_FILENAME
+from ..core.auth import AuthResolver
+from ..core.command_logger import CommandLogger
+from ..deps.github_downloader import GitHubPackageDownloader
+from ..models.dependency.reference import DependencyReference
+from ..models.dependency.types import GitReferenceType, RemoteRef
+from ..utils.path_security import PathTraversalError, ensure_path_within, validate_path_segments
+from .deps._utils import _get_detailed_package_info
+
+
+# ------------------------------------------------------------------
+# Valid field names (extensible in follow-up tasks)
+# ------------------------------------------------------------------
+VALID_FIELDS = ("versions",)
+
+
+# ------------------------------------------------------------------
+# Shared helpers (used by both ``apm info`` and ``apm deps info``)
+# ------------------------------------------------------------------
+
+
+def resolve_package_path(
+    package: str,
+    apm_modules_path: Path,
+    logger: CommandLogger,
+) -> Optional[Path]:
+    """Locate the package directory inside *apm_modules_path*.
+
+    Resolution order:
+    1. Direct path match (handles ``org/repo`` and deeper sub-paths).
+    2. Fallback two-level scan for short (repo-only) names.
+
+    Returns *None* when path validation fails (traversal attempt).
+    Exits via ``sys.exit(1)`` when the package cannot be found so that
+    callers do not need to duplicate error handling.
+    """
+    # Guard: reject traversal sequences before building any path
+    try:
+        validate_path_segments(package, context="package name")
+    except PathTraversalError as exc:
+        logger.error(str(exc))
+        return None
+
+    # 1 -- direct match
+    direct_match = apm_modules_path / package
+
+    # Guard: ensure resolved path stays within apm_modules/
+    try:
+        ensure_path_within(direct_match, apm_modules_path)
+    except PathTraversalError as exc:
+        logger.error(str(exc))
+        return None
+
+    if direct_match.is_dir() and (
+        (direct_match / APM_YML_FILENAME).exists()
+        or (direct_match / SKILL_MD_FILENAME).exists()
+    ):
+        return direct_match
+
+    # 2 -- fallback scan
+    for org_dir in apm_modules_path.iterdir():
+        if org_dir.is_dir() and not org_dir.name.startswith("."):
+            for package_dir in org_dir.iterdir():
+                if package_dir.is_dir() and not package_dir.name.startswith("."):
+                    if (
+                        package_dir.name == package
+                        or f"{org_dir.name}/{package_dir.name}" == package
+                    ):
+                        return package_dir
+
+    # Not found -- show available packages and exit
+    logger.error(f"Package '{package}' not found in apm_modules/")
+    logger.progress("Available packages:")
+    for org_dir in apm_modules_path.iterdir():
+        if org_dir.is_dir() and not org_dir.name.startswith("."):
+            for package_dir in org_dir.iterdir():
+                if package_dir.is_dir() and not package_dir.name.startswith("."):
+                    click.echo(f"  - {org_dir.name}/{package_dir.name}")
+    sys.exit(1)
+
+
+def _lookup_lockfile_ref(package: str, project_root: Path):
+    """Return (ref, commit) from the lockfile for *package*, or ("", "")."""
+    try:
+        from ..deps.lockfile import LockFile, get_lockfile_path, migrate_lockfile_if_needed
+
+        migrate_lockfile_if_needed(project_root)
+        lockfile_path = get_lockfile_path(project_root)
+        lockfile = LockFile.read(lockfile_path)
+        if lockfile is None:
+            return "", ""
+
+        # Try exact key first, then substring match
+        dep = lockfile.dependencies.get(package)
+        if dep is None:
+            for key, d in lockfile.dependencies.items():
+                if package in key or key.endswith(f"/{package}"):
+                    dep = d
+                    break
+
+        if dep is not None:
+            return dep.resolved_ref or "", dep.resolved_commit or ""
+    except Exception:
+        pass
+    return "", ""
+
+
+def display_package_info(
+    package: str,
+    package_path: Path,
+    logger: CommandLogger,
+    project_root: Optional[Path] = None,
+) -> None:
+    """Load and render package metadata to the terminal.
+
+    Uses a Rich panel when available, falling back to plain text.
+    When *project_root* is provided, the lockfile is consulted for
+    ref and commit information.
+    """
+    try:
+        package_info = _get_detailed_package_info(package_path)
+
+        # Look up lockfile entry for ref/commit info
+        locked_ref = ""
+        locked_commit = ""
+        if project_root is not None:
+            locked_ref, locked_commit = _lookup_lockfile_ref(
+                package, project_root
+            )
+
+        try:
+            from rich.panel import Panel
+            from rich.console import Console
+
+            console = Console()
+
+            content_lines = []
+            content_lines.append(f"[bold]Name:[/bold] {package_info['name']}")
+            content_lines.append(f"[bold]Version:[/bold] {package_info['version']}")
+            content_lines.append(
+                f"[bold]Description:[/bold] {package_info['description']}"
+            )
+            content_lines.append(f"[bold]Author:[/bold] {package_info['author']}")
+            content_lines.append(f"[bold]Source:[/bold] {package_info['source']}")
+            if locked_ref:
+                content_lines.append(f"[bold]Ref:[/bold] {locked_ref}")
+            if locked_commit:
+                content_lines.append(
+                    f"[bold]Commit:[/bold] {locked_commit[:12]}"
+                )
+            content_lines.append(
+                f"[bold]Install Path:[/bold] {package_info['install_path']}"
+            )
+            content_lines.append("")
+            content_lines.append("[bold]Context Files:[/bold]")
+
+            for context_type, count in package_info["context_files"].items():
+                if count > 0:
+                    content_lines.append(f"  * {count} {context_type}")
+
+            if not any(
+                count > 0 for count in package_info["context_files"].values()
+            ):
+                content_lines.append("  * No context files found")
+
+            content_lines.append("")
+            content_lines.append("[bold]Agent Workflows:[/bold]")
+            if package_info["workflows"] > 0:
+                content_lines.append(
+                    f"  * {package_info['workflows']} executable workflows"
+                )
+            else:
+                content_lines.append("  * No agent workflows found")
+
+            if package_info.get("hooks", 0) > 0:
+                content_lines.append("")
+                content_lines.append("[bold]Hooks:[/bold]")
+                content_lines.append(f"  * {package_info['hooks']} hook file(s)")
+
+            content = "\n".join(content_lines)
+            panel = Panel(
+                content,
+                title=f"[[i]] Package Info: {package}",
+                border_style="cyan",
+            )
+            console.print(panel)
+
+        except ImportError:
+            # Fallback text display
+            click.echo(f"[i] Package Info: {package}")
+            click.echo("=" * 40)
+            click.echo(f"Name: {package_info['name']}")
+            click.echo(f"Version: {package_info['version']}")
+            click.echo(f"Description: {package_info['description']}")
+            click.echo(f"Author: {package_info['author']}")
+            click.echo(f"Source: {package_info['source']}")
+            if locked_ref:
+                click.echo(f"Ref: {locked_ref}")
+            if locked_commit:
+                click.echo(f"Commit: {locked_commit[:12]}")
+            click.echo(f"Install Path: {package_info['install_path']}")
+            click.echo("")
+            click.echo("Context Files:")
+
+            for context_type, count in package_info["context_files"].items():
+                if count > 0:
+                    click.echo(f"  * {count} {context_type}")
+
+            if not any(
+                count > 0 for count in package_info["context_files"].values()
+            ):
+                click.echo("  * No context files found")
+
+            click.echo("")
+            click.echo("Agent Workflows:")
+            if package_info["workflows"] > 0:
+                click.echo(
+                    f"  * {package_info['workflows']} executable workflows"
+                )
+            else:
+                click.echo("  * No agent workflows found")
+
+            if package_info.get("hooks", 0) > 0:
+                click.echo("")
+                click.echo("Hooks:")
+                click.echo(f"  * {package_info['hooks']} hook file(s)")
+
+    except Exception as e:
+        logger.error(f"Error reading package information: {e}")
+        sys.exit(1)
+
+
+def display_versions(package: str, logger: CommandLogger) -> None:
+    """Query and display available remote versions (tags/branches).
+
+    This is a purely remote operation -- it does NOT require the package
+    to be installed locally. It parses *package* as a
+    ``DependencyReference``, queries remote refs via
+    ``GitHubPackageDownloader.list_remote_refs``, and renders the result
+    as a Rich table (with a plain-text fallback).
+    """
+    try:
+        dep_ref = DependencyReference.parse(package)
+    except ValueError as exc:
+        logger.error(f"Invalid package reference '{package}': {exc}")
+        sys.exit(1)
+
+    try:
+        downloader = GitHubPackageDownloader(auth_resolver=AuthResolver())
+        refs: List[RemoteRef] = downloader.list_remote_refs(dep_ref)
+    except RuntimeError as exc:
+        logger.error(f"Failed to list versions for '{package}': {exc}")
+        sys.exit(1)
+
+    if not refs:
+        logger.progress(f"No versions found for '{package}'")
+        return
+
+    # -- render with Rich table (fallback to plain text) ---------------
+    try:
+        from rich.console import Console
+        from rich.table import Table
+
+        console = Console()
+        table = Table(
+            title=f"Available versions: {package}",
+            show_header=True,
+            header_style="bold cyan",
+        )
+        table.add_column("Name", style="bold white")
+        table.add_column("Type", style="yellow")
+        table.add_column("Commit", style="dim white")
+
+        for ref in refs:
+            table.add_row(
+                ref.name,
+                ref.ref_type.value,
+                ref.commit_sha[:8],
+            )
+
+        console.print(table)
+
+    except ImportError:
+        # Plain-text fallback
+        click.echo(f"Available versions: {package}")
+        click.echo("-" * 50)
+        click.echo(f"{'Name':<30} {'Type':<10} {'Commit':<10}")
+        click.echo("-" * 50)
+        for ref in refs:
+            click.echo(
+                f"{ref.name:<30} {ref.ref_type.value:<10} "
+                f"{ref.commit_sha[:8]:<10}"
+            )
+
+
+# ------------------------------------------------------------------
+# Click command
+# ------------------------------------------------------------------
+
+
+@click.command()
+@click.argument("package", required=True)
+@click.argument("field", required=False, default=None)
+@click.option("--global", "-g", "global_", is_flag=True, default=False,
+              help="Inspect package from user scope (~/.apm/)")
+def info(package: str, field: Optional[str], global_: bool):
+    """Show information about a package.
+
+    Without FIELD, displays local metadata for an installed package.
+    With FIELD, queries specific data (may contact the remote).
+
+    \b
+    Fields:
+      versions    List available remote tags and branches
+
+    \b
+    Examples:
+      apm info org/repo            # Local metadata
+      apm info org/repo versions   # Remote tags/branches
+      apm info org/repo -g         # From user scope
+    """
+    from ..core.scope import InstallScope, get_apm_dir
+
+    logger = CommandLogger("info")
+
+    # --- field validation (before any I/O) ---
+    if field is not None:
+        if field not in VALID_FIELDS:
+            valid_list = ", ".join(VALID_FIELDS)
+            logger.error(
+                f"Unknown field '{field}'. Valid fields: {valid_list}"
+            )
+            sys.exit(1)
+
+        if field == "versions":
+            display_versions(package, logger)
+            return
+
+    # --- default: show local metadata ---
+    scope = InstallScope.USER if global_ else InstallScope.PROJECT
+    if global_:
+        project_root = get_apm_dir(scope)
+        apm_modules_path = project_root / APM_MODULES_DIR
+    else:
+        project_root = Path(".")
+        apm_modules_path = project_root / APM_MODULES_DIR
+
+    if not apm_modules_path.exists():
+        logger.error("No apm_modules/ directory found")
+        logger.progress("Run 'apm install' to install dependencies first")
+        sys.exit(1)
+
+    package_path = resolve_package_path(package, apm_modules_path, logger)
+    if package_path is None:
+        sys.exit(1)
+    display_package_info(package, package_path, logger, project_root=project_root)
diff --git a/src/apm_cli/commands/outdated.py b/src/apm_cli/commands/outdated.py
new file mode 100644
index 00000000..2bef64cc
--- /dev/null
+++ b/src/apm_cli/commands/outdated.py
@@ -0,0 +1,357 @@
+"""Check for outdated locked dependencies.
+
+Compares locked dependency commit SHAs against remote tip SHAs.
+For tag-pinned deps, also shows the latest available semver tag.
+"""
+
+import re
+import sys
+
+import click
+
+
+TAG_RE = re.compile(r"^v?\d+\.\d+\.\d+")
+
+
+def _is_tag_ref(ref: str) -> bool:
+    """Return True when *ref* looks like a semver tag (v1.2.3 or 1.2.3)."""
+    return bool(TAG_RE.match(ref)) if ref else False
+
+
+def _strip_v(ref: str) -> str:
+    """Strip a leading 'v' prefix from a version string."""
+    return ref[1:] if ref and ref.startswith("v") else (ref or "")
+
+
+def _find_remote_tip(ref_name, remote_refs):
+    """Find the tip SHA for a branch ref from remote refs.
+
+    If *ref_name* is empty/None, falls back to common default branch
+    names (main, master).
+    Returns the commit SHA string or None if not found.
+    """
+    from ..models.dependency.types import GitReferenceType
+
+    if not remote_refs:
+        return None
+
+    branch_refs = {r.name: r.commit_sha for r in remote_refs
+                   if r.ref_type == GitReferenceType.BRANCH}
+
+    if ref_name:
+        return branch_refs.get(ref_name)
+
+    # No ref specified -- find the default branch
+    for default in ("main", "master"):
+        if default in branch_refs:
+            return branch_refs[default]
+
+    # Last resort: first branch in list
+    if branch_refs:
+        return next(iter(branch_refs.values()))
+
+    return None
+
+
+def _check_one_dep(dep, downloader, verbose):
+    """Check a single dependency against remote refs.
+
+    Returns a result tuple: (package_name, current, latest, status, extra_tags).
+    This function is safe to call from a thread pool.
+    """
+    from ..models.dependency.reference import DependencyReference
+    from ..models.dependency.types import GitReferenceType
+    from ..utils.version_checker import is_newer_version
+
+    current_ref = dep.resolved_ref or ""
+    locked_sha = dep.resolved_commit or ""
+    package_name = dep.get_unique_key()
+
+    # Build a DependencyReference to query remote refs
+    try:
+        # Use parse() to correctly handle all host types (GitHub, ADO, etc.)
+        full_url = f"{dep.host}/{dep.repo_url}" if dep.host else dep.repo_url
+        dep_ref = DependencyReference.parse(full_url)
+    except Exception:
+        return (package_name, current_ref or "(none)", "-", "unknown", [])
+
+    # Fetch remote refs
+    try:
+        remote_refs = downloader.list_remote_refs(dep_ref)
+    except Exception:
+        return (package_name, current_ref or "(none)", "-", "unknown", [])
+
+    is_tag = _is_tag_ref(current_ref)
+
+    if is_tag:
+        tag_refs = [r for r in remote_refs if r.ref_type == GitReferenceType.TAG]
+        if not tag_refs:
+            return (package_name, current_ref, "-", "unknown", [])
+
+        latest_tag = tag_refs[0].name
+        current_ver = _strip_v(current_ref)
+        latest_ver = _strip_v(latest_tag)
+
+        if is_newer_version(current_ver, latest_ver):
+            extra = [r.name for r in tag_refs[:10]] if verbose else []
+            return (package_name, current_ref, latest_tag, "outdated", extra)
+        else:
+            return (package_name, current_ref, latest_tag, "up-to-date", [])
+    else:
+        remote_tip_sha = _find_remote_tip(current_ref, remote_refs)
+
+        if not remote_tip_sha:
+            return (package_name, current_ref or "(none)", "-", "unknown", [])
+
+        display_ref = current_ref or "(default)"
+        if locked_sha and locked_sha != remote_tip_sha:
+            latest_display = remote_tip_sha[:8]
+            return (package_name, display_ref, latest_display, "outdated", [])
+        else:
+            return (package_name, display_ref, remote_tip_sha[:8], "up-to-date", [])
+
+
+@click.command(name="outdated")
+@click.option("--global", "-g", "global_", is_flag=True, default=False,
+              help="Check user-scope dependencies (~/.apm/)")
+@click.option("--verbose", "-v", is_flag=True, default=False,
+              help="Show additional info (e.g., available tags for outdated deps)")
+@click.option("--parallel-checks", "-j", type=int, default=4,
+              help="Max concurrent remote checks (default: 4, 0 = sequential)")
+def outdated(global_, verbose, parallel_checks):
+    """Show outdated locked dependencies.
+
+    Compares each locked dependency against the remote to detect staleness.
+    Tag-pinned deps use semver comparison; branch-pinned deps compare commit SHAs.
+
+    \b
+    Examples:
+      apm outdated             # Check project deps
+      apm outdated --global    # Check user-scope deps
+      apm outdated --verbose   # Show available tags
+      apm outdated -j 8        # Use 8 parallel checks
+    """
+    from ..core.command_logger import CommandLogger
+    from ..core.scope import InstallScope, get_apm_dir
+    from ..deps.lockfile import LockFile, get_lockfile_path, migrate_lockfile_if_needed
+
+    logger = CommandLogger("outdated", verbose=verbose)
+
+    # Resolve scope and lockfile path
+    scope = InstallScope.USER if global_ else InstallScope.PROJECT
+    project_root = get_apm_dir(scope)
+
+    migrate_lockfile_if_needed(project_root)
+    lockfile_path = get_lockfile_path(project_root)
+    lockfile = LockFile.read(lockfile_path)
+
+    if lockfile is None:
+        scope_hint = "~/.apm/" if global_ else "current directory"
+        logger.error(f"No lockfile found in {scope_hint}")
+        sys.exit(1)
+
+    if not lockfile.dependencies:
+        logger.success("No locked dependencies to check")
+        return
+
+    # Lazy-init downloader only when we have deps to check
+    from ..core.auth import AuthResolver
+    from ..deps.github_downloader import GitHubPackageDownloader
+
+    auth_resolver = AuthResolver()
+    downloader = GitHubPackageDownloader(auth_resolver=auth_resolver)
+
+    # Filter to checkable deps (skip local + Artifactory)
+    checkable = []
+    for key, dep in lockfile.dependencies.items():
+        if dep.source == "local":
+            logger.verbose_detail(f"Skipping local dep: {key}")
+            continue
+        if dep.registry_prefix:
+            logger.verbose_detail(f"Skipping Artifactory dep: {key}")
+            continue
+        checkable.append(dep)
+
+    if not checkable:
+        logger.success("No remote dependencies to check")
+        return
+
+    # Check deps with progress feedback and optional parallelism
+    rows = _check_deps_with_progress(checkable, downloader, verbose,
+                                     parallel_checks, logger)
+
+    if not rows:
+        logger.success("No remote dependencies to check")
+        return
+
+    # Check if everything is up-to-date
+    has_outdated = any(status == "outdated" for _, _, _, status, _ in rows)
+    has_unknown = any(status == "unknown" for _, _, _, status, _ in rows)
+
+    if not has_outdated and not has_unknown:
+        logger.success("All dependencies are up-to-date")
+        return
+
+    # Render the table
+    try:
+        from rich.table import Table
+
+        from ._helpers import _get_console
+
+        console = _get_console()
+        if console is None:
+            raise ImportError("Rich console not available")
+
+        table = Table(
+            title="Dependency Status",
+            show_header=True,
+            header_style="bold cyan",
+        )
+        table.add_column("Package", style="white", min_width=20)
+        table.add_column("Current", style="white", min_width=10)
+        table.add_column("Latest", style="white", min_width=10)
+        table.add_column("Status", min_width=12)
+
+        status_styles = {
+            "up-to-date": "green",
+            "outdated": "yellow",
+            "unknown": "dim",
+        }
+
+        for package, current, latest, status, extra_tags in rows:
+            style = status_styles.get(status, "white")
+            table.add_row(package, current, latest, f"[{style}]{status}[/{style}]")
+
+            if verbose and extra_tags:
+                tags_str = ", ".join(extra_tags)
+                table.add_row("", "", f"[dim]tags: {tags_str}[/dim]", "")
+
+        console.print(table)
+
+    except Exception:
+        # Fallback: plain text output (also covers the ImportError raised above)
+        click.echo("Package                 Current      Latest       Status")
+        click.echo("-" * 65)
+        for package, current, latest, status, extra_tags in rows:
+            click.echo(f"{package:<24}{current:<13}{latest:<13}{status}")
+            if verbose and extra_tags:
+                click.echo(f"{'':24}tags: {', '.join(extra_tags)}")
+
+    # Summary
+    outdated_count = sum(1 for _, _, _, s, _ in rows if s == "outdated")
+    if outdated_count:
+        logger.warning(f"{outdated_count} outdated "
+                       f"{'dependency' if outdated_count == 1 else 'dependencies'} found")
+    elif has_unknown:
+        logger.progress("Some dependencies could not be checked (branch/commit refs)")
+
+
+def _check_deps_with_progress(checkable, downloader, verbose, parallel_checks,
+                              logger):
+    """Check all deps with a Rich progress bar and optional parallelism."""
+    rows = []
+    total = len(checkable)
+
+    try:
+        from rich.progress import (
+            BarColumn,
+            Progress,
+            SpinnerColumn,
+            TaskProgressColumn,
+            TextColumn,
+        )
+
+        with Progress(
+            SpinnerColumn(),
+            TextColumn("[cyan]{task.description}[/cyan]"),
+            BarColumn(),
+            TaskProgressColumn(),
+            transient=True,
+        ) as progress:
+            if parallel_checks > 0 and total > 1:
+                rows = _check_parallel(
+                    checkable, downloader, verbose, parallel_checks,
+                    progress, logger,
+                )
+            else:
+                task_id = progress.add_task(
+                    f"Checking {total} dependencies", total=total,
+                )
+                for dep in checkable:
+                    short = dep.get_unique_key().split("/")[-1]
+                    progress.update(task_id, description=f"Checking {short}")
+                    result = _check_one_dep(dep, downloader, verbose)
+                    rows.append(result)
+                    progress.advance(task_id)
+    except ImportError:
+        # No Rich -- plain text feedback
+        logger.progress(f"Checking {total} dependencies...")
+        if parallel_checks > 0 and total > 1:
+            rows = _check_parallel_plain(
+                checkable, downloader, verbose, parallel_checks,
+            )
+        else:
+            for dep in checkable:
+                rows.append(_check_one_dep(dep, downloader, verbose))
+
+    return rows
+
+
+def _check_parallel(checkable, downloader, verbose, max_workers,
+                    progress, logger):
+    """Run checks in parallel with Rich progress display."""
+    from concurrent.futures import ThreadPoolExecutor, as_completed
+
+    total = len(checkable)
+    max_workers = min(max_workers, total)
+    overall_id = progress.add_task(
+        f"Checking {total} dependencies", total=total,
+    )
+
+    results = {}
+    with ThreadPoolExecutor(max_workers=max_workers) as executor:
+        futures = {}
+        for dep in checkable:
+            short = dep.get_unique_key().split("/")[-1]
+            task_id = progress.add_task(f"Checking {short}", total=None)
+            fut = executor.submit(_check_one_dep, dep, downloader, verbose)
+            futures[fut] = (dep, task_id)
+
+        for fut in as_completed(futures):
+            dep, task_id = futures[fut]
+            try:
+                result = fut.result()
+            except Exception:
+                pkg = dep.get_unique_key()
+                result = (pkg, "(none)", "-", "unknown", [])
+            results[dep.get_unique_key()] = result
+            progress.update(task_id, visible=False)
+            progress.advance(overall_id)
+
+    # Preserve original order
+    return [results[dep.get_unique_key()] for dep in checkable
+            if dep.get_unique_key() in results]
+
+
+def _check_parallel_plain(checkable, downloader, verbose, max_workers):
+    """Run checks in parallel without Rich (plain fallback)."""
+    from concurrent.futures import ThreadPoolExecutor, as_completed
+
+    max_workers = min(max_workers, len(checkable))
+    results = {}
+    with ThreadPoolExecutor(max_workers=max_workers) as executor:
+        futures = {
+            executor.submit(_check_one_dep, dep, downloader, verbose): dep
+            for dep in checkable
+        }
+        for fut in as_completed(futures):
+            dep = futures[fut]
+            try:
+                result = fut.result()
+            except Exception:
+                pkg = dep.get_unique_key()
+                result = (pkg, "(none)", "-", "unknown", [])
+            results[dep.get_unique_key()] = result
+
+    return [results[dep.get_unique_key()] for dep in checkable
+            if dep.get_unique_key() in results]
diff --git a/src/apm_cli/deps/github_downloader.py b/src/apm_cli/deps/github_downloader.py
index 0e5a1535..469ddc9c 100644
--- a/src/apm_cli/deps/github_downloader.py
+++ b/src/apm_cli/deps/github_downloader.py
@@ -8,7 +8,7 @@ import time
 from datetime import datetime
 from pathlib import Path
-from typing import Optional, Dict, Any, Callable
+from typing import Optional, Dict, Any, Callable, List
 import random
 import re
 from typing import Union
@@ -22,6 +22,7 @@ from ..models.apm_package import (
     DependencyReference,
     PackageInfo,
+    RemoteRef,
     ResolvedReference,
     GitReferenceType,
     PackageType,
@@ -734,6 +735,160 @@ def _clone_with_fallback(self, repo_url_base: str, target_path: Path, progress_r
         raise RuntimeError(error_msg)
 
+    # ------------------------------------------------------------------
+    # Remote ref enumeration (no clone required)
+    #
------------------------------------------------------------------
+
+    @staticmethod
+    def _parse_ls_remote_output(output: str) -> List[RemoteRef]:
+        """Parse ``git ls-remote --tags --heads`` output into RemoteRef objects.
+
+        Format per line: ``<sha>\\t<refname>``
+
+        For annotated tags git emits two lines::
+
+            <tag-object-sha>   refs/tags/v1.0.0
+            <commit-sha>       refs/tags/v1.0.0^{}
+
+        We want the commit SHA (from the ``^{}`` line) and skip the
+        tag-object-only line.
+
+        Args:
+            output: Raw stdout from ``git ls-remote``.
+
+        Returns:
+            Unsorted list of RemoteRef.
+        """
+        tags: Dict[str, str] = {}  # tag name -> commit sha
+        branches: List[RemoteRef] = []
+
+        for line in output.splitlines():
+            line = line.strip()
+            if not line:
+                continue
+            parts = line.split("\t", 1)
+            if len(parts) != 2:
+                continue
+            sha, refname = parts[0].strip(), parts[1].strip()
+
+            if refname.startswith("refs/tags/"):
+                tag_name = refname[len("refs/tags/"):]
+                if tag_name.endswith("^{}"):
+                    # Dereferenced commit -- overwrite with the real commit SHA
+                    tag_name = tag_name[:-3]
+                    tags[tag_name] = sha
+                else:
+                    # Only store if we haven't seen the deref line yet
+                    tags.setdefault(tag_name, sha)
+
+            elif refname.startswith("refs/heads/"):
+                branch_name = refname[len("refs/heads/"):]
+                branches.append(RemoteRef(
+                    name=branch_name,
+                    ref_type=GitReferenceType.BRANCH,
+                    commit_sha=sha,
+                ))
+
+        tag_refs = [
+            RemoteRef(name=name, ref_type=GitReferenceType.TAG, commit_sha=sha)
+            for name, sha in tags.items()
+        ]
+        return tag_refs + branches
+
+    @staticmethod
+    def _semver_sort_key(name: str):
+        """Return a sort key for semver-like tag names (descending).
+
+        Non-semver tags sort after all semver tags, alphabetically.
+ """ + clean = name.lstrip("vV") + m = re.match(r"^(\d+)\.(\d+)\.(\d+)(.*)", clean) + if m: + # Negate for descending order within the first group + return (0, -int(m.group(1)), -int(m.group(2)), -int(m.group(3)), m.group(4)) + return (1, name) + + @classmethod + def _sort_remote_refs(cls, refs: List[RemoteRef]) -> List[RemoteRef]: + """Sort refs: tags first (semver descending), then branches alphabetically.""" + tags = [r for r in refs if r.ref_type == GitReferenceType.TAG] + branches = [r for r in refs if r.ref_type == GitReferenceType.BRANCH] + tags.sort(key=lambda r: cls._semver_sort_key(r.name)) + branches.sort(key=lambda r: r.name) + return tags + branches + + def list_remote_refs(self, dep_ref: DependencyReference) -> List[RemoteRef]: + """Enumerate remote tags and branches without cloning. + + Uses ``git ls-remote --tags --heads`` for all git hosts (GitHub, + Azure DevOps, GitLab, generic). Artifactory dependencies return + an empty list (no git repo). + + Args: + dep_ref: Dependency reference describing the remote repo. + + Returns: + Sorted list of RemoteRef -- tags first (semver descending), + then branches (alphabetically ascending). + + Raises: + RuntimeError: If the git command fails. 
+ """ + # Artifactory: no git repo to query + if dep_ref.is_artifactory(): + return [] + + is_ado = dep_ref.is_azure_devops() + dep_token = self._resolve_dep_token(dep_ref) + + # All git hosts: git ls-remote + repo_url_base = dep_ref.repo_url + + # Build the env -- mirror _clone_with_fallback logic + if dep_token: + ls_env = self.git_env + else: + ls_env = { + k: v for k, v in self.git_env.items() + if k not in ("GIT_ASKPASS", "GIT_CONFIG_GLOBAL", "GIT_CONFIG_NOSYSTEM") + } + ls_env["GIT_TERMINAL_PROMPT"] = "0" + + # Build authenticated URL + remote_url = self._build_repo_url( + repo_url_base, use_ssh=False, dep_ref=dep_ref, token=dep_token, + ) + + try: + g = git.cmd.Git() + output = g.ls_remote("--tags", "--heads", remote_url, env=ls_env) + refs = self._parse_ls_remote_output(output) + return self._sort_remote_refs(refs) + except GitCommandError as e: + dep_host = dep_ref.host + if dep_host: + is_github = is_github_hostname(dep_host) + else: + is_github = True + is_generic = not is_ado and not is_github + + error_msg = f"Failed to list remote refs for {repo_url_base}. " + if is_generic: + host_name = dep_host or "the target host" + error_msg += ( + f"For private repositories on {host_name}, configure SSH keys " + f"or a git credential helper. " + f"APM delegates authentication to git for non-GitHub/ADO hosts." + ) + else: + host = dep_host or default_host() + org = repo_url_base.split("/")[0] if repo_url_base else None + error_msg += self.auth_resolver.build_error_context(host, "list refs", org=org) + + sanitized = self._sanitize_git_error(str(e)) + error_msg += f" Last error: {sanitized}" + raise RuntimeError(error_msg) from e + def resolve_git_reference(self, repo_ref: Union[str, "DependencyReference"]) -> ResolvedReference: """Resolve a Git reference (branch/tag/commit) to a specific commit SHA. 
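The annotated-tag handling in `_parse_ls_remote_output` above is the subtle part of this hunk: `git ls-remote` prints the tag-object SHA on one line and the dereferenced commit SHA on a `^{}` line, and only the latter is comparable with a locked commit. A standalone sketch of that rule, where `Ref` and `parse_ls_remote` are simplified, illustrative stand-ins for APM's `RemoteRef` and the real method:

```python
from dataclasses import dataclass


@dataclass
class Ref:
    name: str
    kind: str  # "tag" or "branch"
    sha: str


def parse_ls_remote(output: str) -> list:
    """Parse git ls-remote --tags --heads output (SHA, tab, refname per line)."""
    tags = {}       # tag name -> commit SHA
    branches = []
    for line in output.splitlines():
        parts = line.strip().split("\t", 1)
        if len(parts) != 2:
            continue  # skip blank or malformed lines
        sha, refname = parts[0].strip(), parts[1].strip()
        if refname.startswith("refs/tags/"):
            name = refname[len("refs/tags/"):]
            if name.endswith("^{}"):
                # Annotated tag: the ^{} line carries the commit SHA -- prefer it.
                tags[name[:-3]] = sha
            else:
                # Lightweight tag, or annotated tag whose ^{} line hasn't arrived yet.
                tags.setdefault(name, sha)
        elif refname.startswith("refs/heads/"):
            branches.append(Ref(refname[len("refs/heads/"):], "branch", sha))
    return [Ref(n, "tag", s) for n, s in tags.items()] + branches
```

Because the `^{}` entry unconditionally overwrites and plain entries only `setdefault`, the result is the same regardless of which of the two lines git prints first.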
diff --git a/src/apm_cli/models/apm_package.py b/src/apm_cli/models/apm_package.py index 367177eb..a691c2e9 100644 --- a/src/apm_cli/models/apm_package.py +++ b/src/apm_cli/models/apm_package.py @@ -15,6 +15,7 @@ DependencyReference, GitReferenceType, MCPDependency, + RemoteRef, ResolvedReference, parse_git_reference, ) @@ -33,6 +34,7 @@ "DependencyReference", "GitReferenceType", "MCPDependency", + "RemoteRef", "ResolvedReference", "parse_git_reference", # Backward-compatible re-exports from .validation diff --git a/src/apm_cli/models/dependency/__init__.py b/src/apm_cli/models/dependency/__init__.py index edab860c..538ac9aa 100644 --- a/src/apm_cli/models/dependency/__init__.py +++ b/src/apm_cli/models/dependency/__init__.py @@ -2,12 +2,13 @@ from .mcp import MCPDependency from .reference import DependencyReference -from .types import GitReferenceType, ResolvedReference, VirtualPackageType, parse_git_reference +from .types import GitReferenceType, RemoteRef, ResolvedReference, VirtualPackageType, parse_git_reference __all__ = [ "DependencyReference", "GitReferenceType", "MCPDependency", + "RemoteRef", "ResolvedReference", "VirtualPackageType", "parse_git_reference", diff --git a/src/apm_cli/models/dependency/types.py b/src/apm_cli/models/dependency/types.py index 7344cc92..6779474b 100644 --- a/src/apm_cli/models/dependency/types.py +++ b/src/apm_cli/models/dependency/types.py @@ -13,6 +13,15 @@ class GitReferenceType(Enum): COMMIT = "commit" +@dataclass +class RemoteRef: + """A single remote Git reference (tag or branch) with its commit SHA.""" + + name: str + ref_type: GitReferenceType + commit_sha: str + + class VirtualPackageType(Enum): """Type of virtual package.""" FILE = "file" # Individual file (*.prompt.md, etc.) 
diff --git a/tests/unit/test_deps_list_tree_info.py b/tests/unit/test_deps_list_tree_info.py
index 9262f0ab..f9b14128 100644
--- a/tests/unit/test_deps_list_tree_info.py
+++ b/tests/unit/test_deps_list_tree_info.py
@@ -372,3 +372,33 @@ def test_info_shows_no_workflows_for_bare_package(self):
         result = self.runner.invoke(cli, ["deps", "info", "wforg/wfrepo"])
         assert result.exit_code == 0
         assert "No agent workflows found" in result.output
+
+    def test_deps_info_is_identical_to_apm_info(self):
+        """``apm deps info <pkg>`` and ``apm info <pkg>`` produce the same output.
+
+        This is the explicit backward-compatibility contract: the ``deps info``
+        alias must not diverge from the top-level ``info`` command.
+        """
+        with self._chdir_tmp() as tmp:
+            self._make_package(
+                tmp,
+                "compatorg",
+                "compatrepo",
+                version="3.0.0",
+                description="Compat test package",
+                author="BackCompatAuthor",
+            )
+            os.chdir(tmp)
+            with _force_rich_fallback():
+                result_deps = self.runner.invoke(
+                    cli, ["deps", "info", "compatorg/compatrepo"]
+                )
+            with _force_rich_fallback():
+                result_info = self.runner.invoke(
+                    cli, ["info", "compatorg/compatrepo"]
+                )
+
+            assert result_deps.exit_code == 0
+            assert result_info.exit_code == 0
+            # The output must be identical (same handler, same display logic)
+            assert result_deps.output == result_info.output
diff --git a/tests/unit/test_info_command.py b/tests/unit/test_info_command.py
new file mode 100644
index 00000000..b70e0e5d
--- /dev/null
+++ b/tests/unit/test_info_command.py
@@ -0,0 +1,358 @@
+"""Tests for the top-level ``apm info`` command."""
+
+import contextlib
+import os
+import sys
+import tempfile
+import types
+from pathlib import Path
+from unittest.mock import MagicMock, patch
+
+import pytest
+from click.testing import CliRunner
+
+from apm_cli.cli import cli
+from apm_cli.models.dependency.types import GitReferenceType, RemoteRef
+
+
+# ------------------------------------------------------------------
+# Rich-fallback helper (same approach as
test_deps_list_tree_info.py) +# ------------------------------------------------------------------ + +def _force_rich_fallback(): + """Context-manager that forces the text-only code path.""" + + @contextlib.contextmanager + def _ctx(): + keys = [ + "rich", + "rich.console", + "rich.table", + "rich.tree", + "rich.panel", + "rich.text", + ] + originals = {k: sys.modules.get(k) for k in keys} + + for k in keys: + stub = types.ModuleType(k) + stub.__path__ = [] + + def _raise(name, _k=k): + raise ImportError(f"rich not available in test: {_k}") + + stub.__getattr__ = _raise + sys.modules[k] = stub + + try: + yield + finally: + for k, v in originals.items(): + if v is None: + sys.modules.pop(k, None) + else: + sys.modules[k] = v + + return _ctx() + + +# ------------------------------------------------------------------ +# Base class with temp-dir helpers +# ------------------------------------------------------------------ + + +class _InfoCmdBase: + """Shared CWD-management helpers.""" + + def setup_method(self): + self.runner = CliRunner() + try: + self.original_dir = os.getcwd() + except FileNotFoundError: + self.original_dir = str(Path(__file__).parent.parent.parent) + os.chdir(self.original_dir) + + def teardown_method(self): + try: + os.chdir(self.original_dir) + except (FileNotFoundError, OSError): + repo_root = Path(__file__).parent.parent.parent + os.chdir(str(repo_root)) + + @contextlib.contextmanager + def _chdir_tmp(self): + with tempfile.TemporaryDirectory() as tmp_dir: + try: + os.chdir(tmp_dir) + yield Path(tmp_dir) + finally: + os.chdir(self.original_dir) + + @staticmethod + def _make_package(root: Path, org: str, repo: str, **kwargs) -> Path: + pkg_dir = root / "apm_modules" / org / repo + pkg_dir.mkdir(parents=True) + version = kwargs.get("version", "1.0.0") + description = kwargs.get("description", "A test package") + author = kwargs.get("author", "TestAuthor") + content = ( + f"name: {repo}\nversion: {version}\n" + f"description: 
{description}\nauthor: {author}\n" + ) + (pkg_dir / "apm.yml").write_text(content) + return pkg_dir + + +# ------------------------------------------------------------------ +# Tests +# ------------------------------------------------------------------ + + +class TestInfoCommand(_InfoCmdBase): + """Tests for the top-level ``apm info`` command.""" + + # -- basic metadata display ------------------------------------------- + + def test_info_shows_package_details(self): + """``apm info org/repo`` shows package metadata (fallback mode).""" + with self._chdir_tmp() as tmp: + self._make_package( + tmp, + "myorg", + "myrepo", + version="2.5.0", + description="My awesome package", + author="Alice", + ) + os.chdir(tmp) + with _force_rich_fallback(): + result = self.runner.invoke(cli, ["info", "myorg/myrepo"]) + assert result.exit_code == 0 + assert "2.5.0" in result.output + assert "My awesome package" in result.output + assert "Alice" in result.output + + # -- missing apm_modules/ --------------------------------------------- + + def test_info_no_apm_modules(self): + """``apm info`` exits with error when apm_modules/ is missing.""" + with self._chdir_tmp(): + result = self.runner.invoke(cli, ["info", "noorg/norepo"]) + assert result.exit_code == 1 + + # -- field: versions (placeholder) ------------------------------------ + + def test_info_versions_lists_refs(self): + """``apm info org/repo versions`` shows tags and branches.""" + mock_refs = [ + RemoteRef(name="v2.0.0", ref_type=GitReferenceType.TAG, + commit_sha="aabbccdd11223344"), + RemoteRef(name="v1.0.0", ref_type=GitReferenceType.TAG, + commit_sha="11223344aabbccdd"), + RemoteRef(name="main", ref_type=GitReferenceType.BRANCH, + commit_sha="deadbeef12345678"), + ] + with patch( + "apm_cli.commands.info.GitHubPackageDownloader" + ) as mock_cls, patch( + "apm_cli.commands.info.AuthResolver" + ): + mock_cls.return_value.list_remote_refs.return_value = mock_refs + with _force_rich_fallback(): + result = 
self.runner.invoke( + cli, ["info", "myorg/myrepo", "versions"] + ) + assert result.exit_code == 0 + assert "v2.0.0" in result.output + assert "v1.0.0" in result.output + assert "main" in result.output + assert "tag" in result.output + assert "branch" in result.output + assert "aabbccdd" in result.output + assert "deadbeef" in result.output + + def test_info_versions_empty_refs(self): + """``apm info org/repo versions`` with no refs shows info message.""" + with patch( + "apm_cli.commands.info.GitHubPackageDownloader" + ) as mock_cls, patch( + "apm_cli.commands.info.AuthResolver" + ): + mock_cls.return_value.list_remote_refs.return_value = [] + result = self.runner.invoke( + cli, ["info", "myorg/myrepo", "versions"] + ) + assert result.exit_code == 0 + assert "no versions found" in result.output.lower() + + def test_info_versions_runtime_error(self): + """``apm info org/repo versions`` exits 1 on RuntimeError.""" + with patch( + "apm_cli.commands.info.GitHubPackageDownloader" + ) as mock_cls, patch( + "apm_cli.commands.info.AuthResolver" + ): + mock_cls.return_value.list_remote_refs.side_effect = RuntimeError( + "auth failed" + ) + result = self.runner.invoke( + cli, ["info", "myorg/myrepo", "versions"] + ) + assert result.exit_code == 1 + assert "failed to list versions" in result.output.lower() + + def test_info_versions_with_ref_shorthand(self): + """``apm info owner/repo#v1.0 versions`` parses ref correctly.""" + mock_refs = [ + RemoteRef(name="v1.0.0", ref_type=GitReferenceType.TAG, + commit_sha="abcdef1234567890"), + ] + with patch( + "apm_cli.commands.info.GitHubPackageDownloader" + ) as mock_cls, patch( + "apm_cli.commands.info.AuthResolver" + ): + mock_cls.return_value.list_remote_refs.return_value = mock_refs + with _force_rich_fallback(): + result = self.runner.invoke( + cli, ["info", "myorg/myrepo#v1.0", "versions"] + ) + assert result.exit_code == 0 + assert "v1.0.0" in result.output + + def test_info_versions_does_not_require_apm_modules(self): + 
"""``apm info org/repo versions`` works without apm_modules/.""" + mock_refs = [ + RemoteRef(name="main", ref_type=GitReferenceType.BRANCH, + commit_sha="1234567890abcdef"), + ] + with self._chdir_tmp(): + # No apm_modules/ created -- should still succeed + with patch( + "apm_cli.commands.info.GitHubPackageDownloader" + ) as mock_cls, patch( + "apm_cli.commands.info.AuthResolver" + ): + mock_cls.return_value.list_remote_refs.return_value = mock_refs + with _force_rich_fallback(): + result = self.runner.invoke( + cli, ["info", "myorg/myrepo", "versions"] + ) + assert result.exit_code == 0 + assert "main" in result.output + + # -- invalid field ---------------------------------------------------- + + def test_info_invalid_field(self): + """``apm info org/repo bad-field`` shows error with valid fields.""" + with self._chdir_tmp() as tmp: + self._make_package(tmp, "forg", "frepo") + os.chdir(tmp) + result = self.runner.invoke(cli, ["info", "forg/frepo", "bad-field"]) + assert result.exit_code == 1 + assert "bad-field" in result.output + assert "versions" in result.output + + # -- short name resolution -------------------------------------------- + + def test_info_short_package_name(self): + """``apm info repo`` resolves by short repo name.""" + with self._chdir_tmp() as tmp: + self._make_package(tmp, "shortorg", "shortrepo") + os.chdir(tmp) + with _force_rich_fallback(): + result = self.runner.invoke(cli, ["info", "shortrepo"]) + assert result.exit_code == 0 + assert "shortrepo" in result.output + + # -- package not found ------------------------------------------------ + + def test_info_package_not_found(self): + """``apm info`` shows error and lists available packages.""" + with self._chdir_tmp() as tmp: + self._make_package(tmp, "existorg", "existrepo") + os.chdir(tmp) + result = self.runner.invoke(cli, ["info", "doesnotexist"]) + assert result.exit_code == 1 + assert "not found" in result.output.lower() or "error" in result.output.lower() + assert 
"existorg/existrepo" in result.output + + # -- SKILL.md-only package (no apm.yml) -------------------------------- + + def test_info_skill_only_package(self): + """``apm info`` works for packages with SKILL.md but no apm.yml.""" + with self._chdir_tmp() as tmp: + pkg_dir = tmp / "apm_modules" / "skillorg" / "skillrepo" + pkg_dir.mkdir(parents=True) + (pkg_dir / "SKILL.md").write_text("# My Skill\n") + os.chdir(tmp) + with _force_rich_fallback(): + result = self.runner.invoke(cli, ["info", "skillorg/skillrepo"]) + assert result.exit_code == 0 + assert "skillrepo" in result.output + + # -- bare package (no context files / no workflows) ------------------- + + def test_info_bare_package_no_context(self): + """``apm info`` reports 'No context files found' for bare packages.""" + with self._chdir_tmp() as tmp: + self._make_package(tmp, "bareorg", "barerepo") + os.chdir(tmp) + with _force_rich_fallback(): + result = self.runner.invoke(cli, ["info", "bareorg/barerepo"]) + assert result.exit_code == 0 + assert "No context files found" in result.output + + def test_info_bare_package_no_workflows(self): + """``apm info`` reports 'No agent workflows found' for bare packages.""" + with self._chdir_tmp() as tmp: + self._make_package(tmp, "wforg", "wfrepo") + os.chdir(tmp) + with _force_rich_fallback(): + result = self.runner.invoke(cli, ["info", "wforg/wfrepo"]) + assert result.exit_code == 0 + assert "No agent workflows found" in result.output + + # -- no args: Click should show error / usage ------------------------- + + def test_info_no_args_shows_error(self): + """``apm info`` with no arguments shows an error (PACKAGE is required).""" + result = self.runner.invoke(cli, ["info"]) + # Click exits 2 for missing required arguments + assert result.exit_code == 2 + # Should mention the missing argument or show usage + assert "PACKAGE" in result.output or "Missing argument" in result.output or "Usage" in result.output + + # -- DependencyReference.parse failure for versions field 
------------- + + def test_info_versions_invalid_parse(self): + """``apm info versions`` exits 1 when DependencyReference.parse raises ValueError.""" + with patch( + "apm_cli.commands.info.DependencyReference" + ) as mock_dep_ref_cls: + mock_dep_ref_cls.parse.side_effect = ValueError("unsupported host: ftp") + result = self.runner.invoke( + cli, ["info", "ftp://bad-host/invalid", "versions"] + ) + assert result.exit_code == 1 + assert "invalid" in result.output.lower() or "ftp" in result.output.lower() + + # -- path traversal prevention ---------------------------------------- + + def test_info_rejects_path_traversal(self): + """``apm info ../../../etc/passwd`` is rejected as a traversal attempt.""" + with self._chdir_tmp() as tmp: + self._make_package(tmp, "org", "legit") + os.chdir(tmp) + result = self.runner.invoke(cli, ["info", "../../../etc/passwd"]) + assert result.exit_code == 1 + assert "traversal" in result.output.lower() + + def test_info_rejects_dot_segment(self): + """``apm info org/../../../etc/passwd`` is rejected.""" + with self._chdir_tmp() as tmp: + self._make_package(tmp, "org", "legit") + os.chdir(tmp) + result = self.runner.invoke(cli, ["info", "org/../../../etc/passwd"]) + assert result.exit_code == 1 + assert "traversal" in result.output.lower() diff --git a/tests/unit/test_list_remote_refs.py b/tests/unit/test_list_remote_refs.py new file mode 100644 index 00000000..a1414431 --- /dev/null +++ b/tests/unit/test_list_remote_refs.py @@ -0,0 +1,469 @@ +"""Tests for GitHubPackageDownloader.list_remote_refs() and helpers.""" + +from unittest.mock import MagicMock, patch, PropertyMock + +import pytest +from git.exc import GitCommandError + +from apm_cli.deps.github_downloader import GitHubPackageDownloader +from apm_cli.models.dependency.reference import DependencyReference +from apm_cli.models.dependency.types import GitReferenceType, RemoteRef + + +# --------------------------------------------------------------------------- +# Fixtures +# 
--------------------------------------------------------------------------- + +SAMPLE_LS_REMOTE = ( + "aaa1111111111111111111111111111111111111\trefs/heads/main\n" + "bbb2222222222222222222222222222222222222\trefs/heads/feature/xyz\n" + "ccc3333333333333333333333333333333333333\trefs/tags/v1.0.0\n" + "ddd4444444444444444444444444444444444444\trefs/tags/v2.0.0\n" + "eee5555555555555555555555555555555555555\trefs/tags/v0.9.0\n" +) + +SAMPLE_LS_REMOTE_WITH_DEREF = ( + "tag1111111111111111111111111111111111111\trefs/tags/v1.0.0\n" + "com1111111111111111111111111111111111111\trefs/tags/v1.0.0^{}\n" + "tag2222222222222222222222222222222222222\trefs/tags/v2.0.0\n" + "com2222222222222222222222222222222222222\trefs/tags/v2.0.0^{}\n" + "aaa1111111111111111111111111111111111111\trefs/heads/main\n" +) + + +def _make_dep_ref(host=None, ado=False, artifactory=False, repo_url="owner/repo"): + """Build a minimal DependencyReference for testing.""" + kwargs = dict(repo_url=repo_url, host=host) + if ado: + kwargs.update( + host=host or "dev.azure.com", + ado_organization="myorg", + ado_project="myproj", + ado_repo="myrepo", + ) + if artifactory: + kwargs["artifactory_prefix"] = "artifactory/github" + return DependencyReference(**kwargs) + + +def _build_downloader(): + """Build a GitHubPackageDownloader with mocked auth.""" + with patch("apm_cli.deps.github_downloader.AuthResolver") as MockAuth: + mock_auth = MockAuth.return_value + mock_auth._token_manager = MagicMock() + mock_auth._token_manager.setup_environment.return_value = { + "PATH": "/usr/bin", + } + mock_auth._token_manager.get_token_for_purpose.return_value = None + mock_auth.build_error_context.return_value = "Check your auth setup." 
+ downloader = GitHubPackageDownloader(auth_resolver=mock_auth) + return downloader + + +# --------------------------------------------------------------------------- +# _parse_ls_remote_output +# --------------------------------------------------------------------------- + +class TestParseLsRemoteOutput: + """Tests for the static ls-remote parser.""" + + def test_tags_and_branches(self): + refs = GitHubPackageDownloader._parse_ls_remote_output(SAMPLE_LS_REMOTE) + names = {r.name for r in refs} + assert "main" in names + assert "feature/xyz" in names + assert "v1.0.0" in names + assert "v2.0.0" in names + assert "v0.9.0" in names + + tag_refs = [r for r in refs if r.ref_type == GitReferenceType.TAG] + branch_refs = [r for r in refs if r.ref_type == GitReferenceType.BRANCH] + assert len(tag_refs) == 3 + assert len(branch_refs) == 2 + + def test_deref_handling(self): + """^{} lines should override tag-object SHAs with the commit SHA.""" + refs = GitHubPackageDownloader._parse_ls_remote_output(SAMPLE_LS_REMOTE_WITH_DEREF) + tag_map = {r.name: r.commit_sha for r in refs if r.ref_type == GitReferenceType.TAG} + # The ^{} commit SHA should be used, not the tag object SHA + assert tag_map["v1.0.0"] == "com1111111111111111111111111111111111111" + assert tag_map["v2.0.0"] == "com2222222222222222222222222222222222222" + # No ^{} entry should appear as a separate ref + assert "v1.0.0^{}" not in tag_map + assert "v2.0.0^{}" not in tag_map + + def test_empty_output(self): + assert GitHubPackageDownloader._parse_ls_remote_output("") == [] + + def test_blank_lines_ignored(self): + output = "\n\naaa1111111111111111111111111111111111111\trefs/heads/main\n\n" + refs = GitHubPackageDownloader._parse_ls_remote_output(output) + assert len(refs) == 1 + assert refs[0].name == "main" + + def test_malformed_lines_skipped(self): + output = ( + "no-tab-here\n" + "aaa1111111111111111111111111111111111111\trefs/heads/main\n" + ) + refs = GitHubPackageDownloader._parse_ls_remote_output(output) 
+ assert len(refs) == 1 + + def test_tag_without_deref(self): + """A lightweight tag (no ^{} line) keeps its own SHA.""" + output = "abc1234567890123456789012345678901234567\trefs/tags/v3.0.0\n" + refs = GitHubPackageDownloader._parse_ls_remote_output(output) + assert len(refs) == 1 + assert refs[0].commit_sha == "abc1234567890123456789012345678901234567" + + def test_mixed_semver_and_non_semver_tags_parsed(self): + """Parser correctly identifies both semver and non-semver tag names.""" + output = ( + "aaa1111111111111111111111111111111111111\trefs/tags/v1.0.0\n" + "bbb2222222222222222222222222222222222222\trefs/tags/latest\n" + "ccc3333333333333333333333333333333333333\trefs/tags/stable\n" + "ddd4444444444444444444444444444444444444\trefs/heads/main\n" + ) + refs = GitHubPackageDownloader._parse_ls_remote_output(output) + names = {r.name for r in refs} + assert "v1.0.0" in names + assert "latest" in names + assert "stable" in names + assert "main" in names + tag_refs = [r for r in refs if r.ref_type == GitReferenceType.TAG] + assert len(tag_refs) == 3 + + +# --------------------------------------------------------------------------- +# _semver_sort_key / _sort_remote_refs +# --------------------------------------------------------------------------- + +class TestSorting: + """Tests for semver sorting logic.""" + + def test_semver_descending(self): + refs = [ + RemoteRef(name="v1.0.0", ref_type=GitReferenceType.TAG, commit_sha="a"), + RemoteRef(name="v2.0.0", ref_type=GitReferenceType.TAG, commit_sha="b"), + RemoteRef(name="v1.5.0", ref_type=GitReferenceType.TAG, commit_sha="c"), + ] + sorted_refs = GitHubPackageDownloader._sort_remote_refs(refs) + names = [r.name for r in sorted_refs] + assert names == ["v2.0.0", "v1.5.0", "v1.0.0"] + + def test_tags_before_branches(self): + refs = [ + RemoteRef(name="main", ref_type=GitReferenceType.BRANCH, commit_sha="a"), + RemoteRef(name="v1.0.0", ref_type=GitReferenceType.TAG, commit_sha="b"), + ] + sorted_refs = 
GitHubPackageDownloader._sort_remote_refs(refs)
+        assert sorted_refs[0].ref_type == GitReferenceType.TAG
+        assert sorted_refs[1].ref_type == GitReferenceType.BRANCH
+
+    def test_branches_alphabetical(self):
+        refs = [
+            RemoteRef(name="develop", ref_type=GitReferenceType.BRANCH, commit_sha="a"),
+            RemoteRef(name="alpha", ref_type=GitReferenceType.BRANCH, commit_sha="b"),
+            RemoteRef(name="main", ref_type=GitReferenceType.BRANCH, commit_sha="c"),
+        ]
+        sorted_refs = GitHubPackageDownloader._sort_remote_refs(refs)
+        names = [r.name for r in sorted_refs]
+        assert names == ["alpha", "develop", "main"]
+
+    def test_non_semver_tags_after_semver(self):
+        refs = [
+            RemoteRef(name="nightly", ref_type=GitReferenceType.TAG, commit_sha="a"),
+            RemoteRef(name="v1.0.0", ref_type=GitReferenceType.TAG, commit_sha="b"),
+        ]
+        sorted_refs = GitHubPackageDownloader._sort_remote_refs(refs)
+        names = [r.name for r in sorted_refs]
+        assert names == ["v1.0.0", "nightly"]
+
+    def test_semver_without_v_prefix(self):
+        refs = [
+            RemoteRef(name="1.0.0", ref_type=GitReferenceType.TAG, commit_sha="a"),
+            RemoteRef(name="2.0.0", ref_type=GitReferenceType.TAG, commit_sha="b"),
+        ]
+        sorted_refs = GitHubPackageDownloader._sort_remote_refs(refs)
+        names = [r.name for r in sorted_refs]
+        assert names == ["2.0.0", "1.0.0"]
+
+    def test_semver_with_prerelease(self):
+        refs = [
+            RemoteRef(name="v1.0.0-beta", ref_type=GitReferenceType.TAG, commit_sha="a"),
+            RemoteRef(name="v1.0.0", ref_type=GitReferenceType.TAG, commit_sha="b"),
+            RemoteRef(name="v1.0.0-alpha", ref_type=GitReferenceType.TAG, commit_sha="c"),
+        ]
+        sorted_refs = GitHubPackageDownloader._sort_remote_refs(refs)
+        names = [r.name for r in sorted_refs]
+        # Same major.minor.patch, prerelease suffixes sorted alphabetically
+        assert names[0] == "v1.0.0"  # empty suffix sorts before "-alpha", "-beta"
+
+    def test_empty_list(self):
+        assert GitHubPackageDownloader._sort_remote_refs([]) == []
+
+    def test_named_non_semver_tags_latest_stable(self):
+        """Common symbolic tag names ('latest', 'stable') sort after semver tags."""
+        refs = [
+            RemoteRef(name="latest", ref_type=GitReferenceType.TAG, commit_sha="a"),
+            RemoteRef(name="stable", ref_type=GitReferenceType.TAG, commit_sha="b"),
+            RemoteRef(name="v2.0.0", ref_type=GitReferenceType.TAG, commit_sha="c"),
+            RemoteRef(name="v1.0.0", ref_type=GitReferenceType.TAG, commit_sha="d"),
+        ]
+        sorted_refs = GitHubPackageDownloader._sort_remote_refs(refs)
+        # Semver tags must appear before non-semver tags
+        names = [r.name for r in sorted_refs]
+        assert names.index("v2.0.0") < names.index("latest")
+        assert names.index("v2.0.0") < names.index("stable")
+        assert names.index("v1.0.0") < names.index("latest")
+        assert names.index("v1.0.0") < names.index("stable")
+
+    def test_all_non_semver_tags_sorted_alphabetically(self):
+        """When no semver tags exist, non-semver tags are sorted alphabetically."""
+        refs = [
+            RemoteRef(name="stable", ref_type=GitReferenceType.TAG, commit_sha="a"),
+            RemoteRef(name="latest", ref_type=GitReferenceType.TAG, commit_sha="b"),
+            RemoteRef(name="edge", ref_type=GitReferenceType.TAG, commit_sha="c"),
+        ]
+        sorted_refs = GitHubPackageDownloader._sort_remote_refs(refs)
+        names = [r.name for r in sorted_refs]
+        assert names == sorted(names)
+
+
+# ---------------------------------------------------------------------------
+# list_remote_refs -- Artifactory
+# ---------------------------------------------------------------------------
+
+class TestListRemoteRefsArtifactory:
+    """Artifactory dependencies should return an empty list."""
+
+    def test_returns_empty(self):
+        dl = _build_downloader()
+        dep = _make_dep_ref(artifactory=True)
+        result = dl.list_remote_refs(dep)
+        assert result == []
+
+
+# ---------------------------------------------------------------------------
+# list_remote_refs -- GitHub / generic (git ls-remote path)
+# ---------------------------------------------------------------------------
+
+class TestListRemoteRefsGitHub:
+    """Tests for the git ls-remote code path."""
+
+    @patch("apm_cli.deps.github_downloader.git.cmd.Git")
+    def test_github_with_token(self, MockGitCmd):
+        """With a resolved token, uses locked-down env and authenticated URL."""
+        dl = _build_downloader()
+        dep = _make_dep_ref(host="github.com")
+
+        dl._resolve_dep_token = MagicMock(return_value="ghp_test_token")
+        dl._build_repo_url = MagicMock(return_value="https://x-access-token:ghp_test_token@github.com/owner/repo.git")
+
+        mock_git = MockGitCmd.return_value
+        mock_git.ls_remote.return_value = SAMPLE_LS_REMOTE
+
+        result = dl.list_remote_refs(dep)
+
+        dl._resolve_dep_token.assert_called_once_with(dep)
+        dl._build_repo_url.assert_called_once_with(
+            "owner/repo", use_ssh=False, dep_ref=dep, token="ghp_test_token",
+        )
+        mock_git.ls_remote.assert_called_once()
+        # Env should be the locked-down git_env (token present)
+        call_kwargs = mock_git.ls_remote.call_args
+        assert call_kwargs.kwargs.get("env") is dl.git_env
+
+        # Result is sorted: tags first (descending), then branches alpha
+        tag_names = [r.name for r in result if r.ref_type == GitReferenceType.TAG]
+        branch_names = [r.name for r in result if r.ref_type == GitReferenceType.BRANCH]
+        assert tag_names == ["v2.0.0", "v1.0.0", "v0.9.0"]
+        assert branch_names == ["feature/xyz", "main"]
+
+    @patch("apm_cli.deps.github_downloader.git.cmd.Git")
+    def test_generic_host_no_token(self, MockGitCmd):
+        """Generic host without token relaxes env (removes GIT_ASKPASS etc.)."""
+        dl = _build_downloader()
+        dep = _make_dep_ref(host="gitlab.example.com")
+
+        dl._resolve_dep_token = MagicMock(return_value=None)
+        dl._build_repo_url = MagicMock(return_value="https://gitlab.example.com/owner/repo.git")
+        # Ensure git_env has the keys that should be removed
+        dl.git_env["GIT_ASKPASS"] = "echo"
+        dl.git_env["GIT_CONFIG_GLOBAL"] = "/dev/null"
+        dl.git_env["GIT_CONFIG_NOSYSTEM"] = "1"
+
+        mock_git = MockGitCmd.return_value
+        mock_git.ls_remote.return_value = SAMPLE_LS_REMOTE
+
+        dl.list_remote_refs(dep)
+
+        call_kwargs = mock_git.ls_remote.call_args
+        used_env = call_kwargs.kwargs.get("env")
+        assert "GIT_ASKPASS" not in used_env
+        assert "GIT_CONFIG_GLOBAL" not in used_env
+        assert "GIT_CONFIG_NOSYSTEM" not in used_env
+        assert used_env.get("GIT_TERMINAL_PROMPT") == "0"
+
+    @patch("apm_cli.deps.github_downloader.git.cmd.Git")
+    def test_git_command_error_raises_runtime_error(self, MockGitCmd):
+        """GitCommandError is wrapped in RuntimeError with auth context."""
+        dl = _build_downloader()
+        dep = _make_dep_ref(host="github.com")
+
+        dl._resolve_dep_token = MagicMock(return_value="ghp_token")
+        dl._build_repo_url = MagicMock(return_value="https://github.com/owner/repo.git")
+
+        mock_git = MockGitCmd.return_value
+        mock_git.ls_remote.side_effect = GitCommandError("ls-remote", 128)
+
+        with pytest.raises(RuntimeError, match="Failed to list remote refs"):
+            dl.list_remote_refs(dep)
+
+        dl.auth_resolver.build_error_context.assert_called_once()
+
+    @patch("apm_cli.deps.github_downloader.git.cmd.Git")
+    def test_deref_tags_use_commit_sha(self, MockGitCmd):
+        """Annotated tags use the commit SHA from the ^{} line."""
+        dl = _build_downloader()
+        dep = _make_dep_ref(host="github.com")
+
+        dl._resolve_dep_token = MagicMock(return_value="tok")
+        dl._build_repo_url = MagicMock(return_value="https://github.com/owner/repo.git")
+
+        mock_git = MockGitCmd.return_value
+        mock_git.ls_remote.return_value = SAMPLE_LS_REMOTE_WITH_DEREF
+
+        result = dl.list_remote_refs(dep)
+        tag_map = {r.name: r.commit_sha for r in result if r.ref_type == GitReferenceType.TAG}
+        assert tag_map["v1.0.0"] == "com1111111111111111111111111111111111111"
+        assert tag_map["v2.0.0"] == "com2222222222222222222222222222222222222"
+
+
+# ---------------------------------------------------------------------------
+# list_remote_refs -- Azure DevOps (git ls-remote path)
+# ---------------------------------------------------------------------------
+
+class TestListRemoteRefsADO:
+    """Tests for ADO deps going through the unified git ls-remote path."""
+
+    @patch("apm_cli.deps.github_downloader.git.cmd.Git")
+    def test_ado_uses_git_ls_remote(self, MockGitCmd):
+        """ADO deps use git ls-remote, not a REST API call."""
+        dl = _build_downloader()
+        dep = _make_dep_ref(ado=True)
+
+        dl._resolve_dep_token = MagicMock(return_value="ado_pat_token")
+        dl._build_repo_url = MagicMock(
+            return_value="https://ado_pat_token@dev.azure.com/myorg/myproj/_git/myrepo",
+        )
+
+        mock_git = MockGitCmd.return_value
+        mock_git.ls_remote.return_value = SAMPLE_LS_REMOTE
+
+        result = dl.list_remote_refs(dep)
+
+        dl._resolve_dep_token.assert_called_once_with(dep)
+        dl._build_repo_url.assert_called_once_with(
+            "owner/repo", use_ssh=False, dep_ref=dep, token="ado_pat_token",
+        )
+        mock_git.ls_remote.assert_called_once()
+
+        # Verify sorted output
+        tag_names = [r.name for r in result if r.ref_type == GitReferenceType.TAG]
+        branch_names = [r.name for r in result if r.ref_type == GitReferenceType.BRANCH]
+        assert tag_names == ["v2.0.0", "v1.0.0", "v0.9.0"]
+        assert branch_names == ["feature/xyz", "main"]
+
+    @patch("apm_cli.deps.github_downloader.git.cmd.Git")
+    def test_ado_git_error_raises_runtime_error(self, MockGitCmd):
+        """ADO git ls-remote failure is wrapped in RuntimeError with auth context."""
+        dl = _build_downloader()
+        dep = _make_dep_ref(ado=True)
+
+        dl._resolve_dep_token = MagicMock(return_value="ado_pat")
+        dl._build_repo_url = MagicMock(
+            return_value="https://ado_pat@dev.azure.com/myorg/myproj/_git/myrepo",
+        )
+
+        mock_git = MockGitCmd.return_value
+        mock_git.ls_remote.side_effect = GitCommandError("ls-remote", 128)
+
+        with pytest.raises(RuntimeError, match="Failed to list remote refs"):
+            dl.list_remote_refs(dep)
+
+        dl.auth_resolver.build_error_context.assert_called_once()
+
+
+# ---------------------------------------------------------------------------
+# Auth token resolution
+# ---------------------------------------------------------------------------
+
+class TestAuthTokenResolution:
+    """Verify correct token resolution per host type."""
+
+    @patch("apm_cli.deps.github_downloader.git.cmd.Git")
+    def test_github_host_resolves_token(self, MockGitCmd):
+        dl = _build_downloader()
+        dep = _make_dep_ref(host="github.com")
+        dl._resolve_dep_token = MagicMock(return_value="ghp_tok")
+        dl._build_repo_url = MagicMock(return_value="https://github.com/o/r.git")
+        MockGitCmd.return_value.ls_remote.return_value = ""
+
+        dl.list_remote_refs(dep)
+
+        dl._resolve_dep_token.assert_called_once_with(dep)
+
+    @patch("apm_cli.deps.github_downloader.git.cmd.Git")
+    def test_ado_host_resolves_token(self, MockGitCmd):
+        dl = _build_downloader()
+        dep = _make_dep_ref(ado=True)
+        dl._resolve_dep_token = MagicMock(return_value="ado_tok")
+        dl._build_repo_url = MagicMock(
+            return_value="https://ado_tok@dev.azure.com/myorg/myproj/_git/myrepo",
+        )
+        MockGitCmd.return_value.ls_remote.return_value = ""
+
+        dl.list_remote_refs(dep)
+
+        dl._resolve_dep_token.assert_called_once_with(dep)
+
+    @patch("apm_cli.deps.github_downloader.git.cmd.Git")
+    def test_generic_host_returns_none_token(self, MockGitCmd):
+        dl = _build_downloader()
+        dep = _make_dep_ref(host="gitlab.example.com")
+        dl._resolve_dep_token = MagicMock(return_value=None)
+        dl._build_repo_url = MagicMock(return_value="https://gitlab.example.com/o/r.git")
+        MockGitCmd.return_value.ls_remote.return_value = ""
+
+        dl.list_remote_refs(dep)
+
+        dl._resolve_dep_token.assert_called_once_with(dep)
+        # _build_repo_url should receive token=None for generic hosts
+        dl._build_repo_url.assert_called_once_with(
+            "owner/repo", use_ssh=False, dep_ref=dep, token=None,
+        )
+
+
+# ---------------------------------------------------------------------------
+# RemoteRef dataclass basics
+# ---------------------------------------------------------------------------
+
+class TestRemoteRefDataclass:
+    """Smoke tests for the RemoteRef dataclass."""
+
+    def test_fields(self):
+        r = RemoteRef(name="v1.0.0", ref_type=GitReferenceType.TAG, commit_sha="abc")
+        assert r.name == "v1.0.0"
+        assert r.ref_type == GitReferenceType.TAG
+        assert r.commit_sha == "abc"
+
+    def test_equality(self):
+        a = RemoteRef(name="main", ref_type=GitReferenceType.BRANCH, commit_sha="x")
+        b = RemoteRef(name="main", ref_type=GitReferenceType.BRANCH, commit_sha="x")
+        assert a == b
+
+    def test_import_from_apm_package(self):
+        """RemoteRef should be importable via the backward-compat re-export path."""
+        from apm_cli.models.apm_package import RemoteRef as Imported
+        assert Imported is RemoteRef
diff --git a/tests/unit/test_outdated_command.py b/tests/unit/test_outdated_command.py
new file mode 100644
index 00000000..b3f23001
--- /dev/null
+++ b/tests/unit/test_outdated_command.py
@@ -0,0 +1,982 @@
+"""Unit tests for the ``apm outdated`` command."""
+
+import contextlib
+import os
+import tempfile
+from pathlib import Path
+from unittest.mock import MagicMock, patch
+
+import pytest
+from click.testing import CliRunner
+
+from apm_cli.cli import cli
+from apm_cli.deps.lockfile import LockedDependency, LockFile
+from apm_cli.models.dependency.types import GitReferenceType, RemoteRef
+
+
+# ---------------------------------------------------------------------------
+# Patch targets -- imports are lazy (inside function body), so we patch
+# at the source module level.
+# ---------------------------------------------------------------------------
+_PATCH_LOCKFILE = "apm_cli.deps.lockfile.LockFile"
+_PATCH_GET_LOCKFILE_PATH = "apm_cli.deps.lockfile.get_lockfile_path"
+_PATCH_MIGRATE = "apm_cli.deps.lockfile.migrate_lockfile_if_needed"
+_PATCH_DOWNLOADER = "apm_cli.deps.github_downloader.GitHubPackageDownloader"
+_PATCH_AUTH = "apm_cli.core.auth.AuthResolver"
+_PATCH_GET_APM_DIR = "apm_cli.core.scope.get_apm_dir"
+
+
+# ---------------------------------------------------------------------------
+# Helpers
+# ---------------------------------------------------------------------------
+def _locked_dep(
+    repo_url="org/pkg",
+    host=None,
+    resolved_ref="v1.0.0",
+    resolved_commit="aaa",
+    source=None,
+    registry_prefix=None,
+):
+    """Build a LockedDependency with sensible defaults."""
+    return LockedDependency(
+        repo_url=repo_url,
+        host=host,
+        resolved_ref=resolved_ref,
+        resolved_commit=resolved_commit,
+        source=source,
+        registry_prefix=registry_prefix,
+    )
+
+
+def _remote_tag(name, sha="abc123"):
+    """Build a RemoteRef tag."""
+    return RemoteRef(name=name, ref_type=GitReferenceType.TAG, commit_sha=sha)
+
+
+def _remote_branch(name, sha="abc123"):
+    """Build a RemoteRef branch."""
+    return RemoteRef(name=name, ref_type=GitReferenceType.BRANCH, commit_sha=sha)
+
+
+def _make_lockfile(deps_dict):
+    """Create a LockFile with the given dependencies dict."""
+    lf = LockFile()
+    lf.dependencies = deps_dict
+    return lf
+
+
+# ---------------------------------------------------------------------------
+# Test class
+# ---------------------------------------------------------------------------
+class TestOutdatedCommand:
+    """Tests for ``apm outdated``."""
+
+    def setup_method(self):
+        self.runner = CliRunner()
+        try:
+            self.original_dir = os.getcwd()
+        except FileNotFoundError:
+            self.original_dir = str(Path(__file__).parent.parent.parent)
+            os.chdir(self.original_dir)
+
+    def teardown_method(self):
+        try:
+            os.chdir(self.original_dir)
+        except (FileNotFoundError, OSError):
+            repo_root = Path(__file__).parent.parent.parent
+            os.chdir(str(repo_root))
+
+    @contextlib.contextmanager
+    def _chdir_tmp(self):
+        """Create a temp dir, chdir into it, restore CWD on exit."""
+        with tempfile.TemporaryDirectory() as tmp_dir:
+            try:
+                os.chdir(tmp_dir)
+                yield Path(tmp_dir)
+            finally:
+                os.chdir(self.original_dir)
+
+    # --- No lockfile ---
+
+    @patch(_PATCH_MIGRATE)
+    @patch(_PATCH_GET_LOCKFILE_PATH)
+    @patch(_PATCH_GET_APM_DIR)
+    @patch(_PATCH_LOCKFILE)
+    def test_no_lockfile_exits_1(
+        self, mock_lf_cls, mock_get_apm_dir, mock_get_path, mock_migrate
+    ):
+        """Exit 1 with error when no lockfile exists."""
+        with self._chdir_tmp() as tmp:
+            mock_get_apm_dir.return_value = tmp
+            mock_get_path.return_value = tmp / "apm.lock.yaml"
+            mock_lf_cls.read.return_value = None
+
+            result = self.runner.invoke(cli, ["outdated"])
+
+            assert result.exit_code == 1
+            assert "No lockfile" in result.output
+
+    # --- Empty lockfile (no deps) ---
+
+    @patch(_PATCH_MIGRATE)
+    @patch(_PATCH_GET_LOCKFILE_PATH)
+    @patch(_PATCH_GET_APM_DIR)
+    @patch(_PATCH_LOCKFILE)
+    def test_empty_lockfile_success(
+        self, mock_lf_cls, mock_get_apm_dir, mock_get_path, mock_migrate
+    ):
+        """Success message when lockfile has zero dependencies."""
+        with self._chdir_tmp() as tmp:
+            mock_get_apm_dir.return_value = tmp
+            mock_get_path.return_value = tmp / "apm.lock.yaml"
+            mock_lf_cls.read.return_value = _make_lockfile({})
+
+            result = self.runner.invoke(cli, ["outdated"])
+
+            assert result.exit_code == 0
+            assert "No locked dependencies" in result.output
+
+    # --- All deps up-to-date ---
+
+    @patch(_PATCH_AUTH)
+    @patch(_PATCH_DOWNLOADER)
+    @patch(_PATCH_MIGRATE)
+    @patch(_PATCH_GET_LOCKFILE_PATH)
+    @patch(_PATCH_GET_APM_DIR)
+    @patch(_PATCH_LOCKFILE)
+    def test_all_up_to_date(
+        self, mock_lf_cls, mock_get_apm_dir, mock_get_path,
+        mock_migrate, mock_dl_cls, mock_auth,
+    ):
+        """Show success message when all deps are at latest tag."""
+        with self._chdir_tmp() as tmp:
+            mock_get_apm_dir.return_value = tmp
+            mock_get_path.return_value = tmp / "apm.lock.yaml"
+
+            deps = {
+                "org/alpha": _locked_dep("org/alpha", resolved_ref="v2.0.0"),
+                "org/beta": _locked_dep("org/beta", resolved_ref="v1.5.0"),
+            }
+            mock_lf_cls.read.return_value = _make_lockfile(deps)
+
+            mock_downloader = MagicMock()
+            mock_dl_cls.return_value = mock_downloader
+            # Both repos: latest tag == locked tag
+            mock_downloader.list_remote_refs.side_effect = [
+                [_remote_tag("v2.0.0")],
+                [_remote_tag("v1.5.0")],
+            ]
+
+            result = self.runner.invoke(cli, ["outdated"])
+
+            assert result.exit_code == 0
+            assert "up-to-date" in result.output.lower()
+
+    # --- Some deps outdated ---
+
+    @patch(_PATCH_AUTH)
+    @patch(_PATCH_DOWNLOADER)
+    @patch(_PATCH_MIGRATE)
+    @patch(_PATCH_GET_LOCKFILE_PATH)
+    @patch(_PATCH_GET_APM_DIR)
+    @patch(_PATCH_LOCKFILE)
+    def test_some_outdated_shows_table(
+        self, mock_lf_cls, mock_get_apm_dir, mock_get_path,
+        mock_migrate, mock_dl_cls, mock_auth,
+    ):
+        """Table is shown when some deps are outdated; exit code is still 0."""
+        with self._chdir_tmp() as tmp:
+            mock_get_apm_dir.return_value = tmp
+            mock_get_path.return_value = tmp / "apm.lock.yaml"
+
+            deps = {
+                "org/alpha": _locked_dep("org/alpha", resolved_ref="v1.0.0"),
+                "org/beta": _locked_dep("org/beta", resolved_ref="v2.0.0"),
+            }
+            mock_lf_cls.read.return_value = _make_lockfile(deps)
+
+            mock_downloader = MagicMock()
+            mock_dl_cls.return_value = mock_downloader
+            mock_downloader.list_remote_refs.side_effect = [
+                [_remote_tag("v2.0.0"), _remote_tag("v1.0.0")],  # alpha outdated
+                [_remote_tag("v2.0.0")],  # beta up-to-date
+            ]
+
+            result = self.runner.invoke(cli, ["outdated"])
+
+            assert result.exit_code == 0
+            assert "org/alpha" in result.output
+            assert "v1.0.0" in result.output
+            assert "v2.0.0" in result.output
+            assert "outdated" in result.output.lower()
+
+    # --- Branch ref (SHA-based comparison) ---
+
+    @patch(_PATCH_AUTH)
+    @patch(_PATCH_DOWNLOADER)
+    @patch(_PATCH_MIGRATE)
+    @patch(_PATCH_GET_LOCKFILE_PATH)
+    @patch(_PATCH_GET_APM_DIR)
+    @patch(_PATCH_LOCKFILE)
+    def test_branch_ref_outdated_when_sha_differs(
+        self, mock_lf_cls, mock_get_apm_dir, mock_get_path,
+        mock_migrate, mock_dl_cls, mock_auth,
+    ):
+        """Branch-pinned dep is outdated when locked SHA differs from remote tip."""
+        with self._chdir_tmp() as tmp:
+            mock_get_apm_dir.return_value = tmp
+            mock_get_path.return_value = tmp / "apm.lock.yaml"
+
+            deps = {
+                "org/branch-pkg": _locked_dep(
+                    "org/branch-pkg", resolved_ref="main", resolved_commit="old_sha"
+                ),
+            }
+            mock_lf_cls.read.return_value = _make_lockfile(deps)
+
+            mock_downloader = MagicMock()
+            mock_dl_cls.return_value = mock_downloader
+            mock_downloader.list_remote_refs.return_value = [
+                _remote_branch("main", sha="new_sha_abc123"),
+            ]
+
+            result = self.runner.invoke(cli, ["outdated"])
+
+            assert result.exit_code == 0
+            assert "outdated" in result.output.lower()
+            assert "org/branch-pkg" in result.output
+            mock_downloader.list_remote_refs.assert_called_once()
+
+    @patch(_PATCH_AUTH)
+    @patch(_PATCH_DOWNLOADER)
+    @patch(_PATCH_MIGRATE)
+    @patch(_PATCH_GET_LOCKFILE_PATH)
+    @patch(_PATCH_GET_APM_DIR)
+    @patch(_PATCH_LOCKFILE)
+    def test_branch_ref_up_to_date_when_sha_matches(
+        self, mock_lf_cls, mock_get_apm_dir, mock_get_path,
+        mock_migrate, mock_dl_cls, mock_auth,
+    ):
+        """Branch-pinned dep is up-to-date when locked SHA matches remote tip."""
+        with self._chdir_tmp() as tmp:
+            mock_get_apm_dir.return_value = tmp
+            mock_get_path.return_value = tmp / "apm.lock.yaml"
+
+            deps = {
+                "org/branch-pkg": _locked_dep(
+                    "org/branch-pkg", resolved_ref="main", resolved_commit="same_sha"
+                ),
+            }
+            mock_lf_cls.read.return_value = _make_lockfile(deps)
+
+            mock_downloader = MagicMock()
+            mock_dl_cls.return_value = mock_downloader
+            mock_downloader.list_remote_refs.return_value = [
+                _remote_branch("main", sha="same_sha"),
+            ]
+
+            result = self.runner.invoke(cli, ["outdated"])
+
+            assert result.exit_code == 0
+            # Should report all up-to-date (success message)
+            assert "up-to-date" in result.output.lower() or "up to date" in result.output.lower()
+
+    # --- Commit ref (unknown status -- no matching branch) ---
+
+    @patch(_PATCH_AUTH)
+    @patch(_PATCH_DOWNLOADER)
+    @patch(_PATCH_MIGRATE)
+    @patch(_PATCH_GET_LOCKFILE_PATH)
+    @patch(_PATCH_GET_APM_DIR)
+    @patch(_PATCH_LOCKFILE)
+    def test_commit_ref_shown_as_unknown(
+        self, mock_lf_cls, mock_get_apm_dir, mock_get_path,
+        mock_migrate, mock_dl_cls, mock_auth,
+    ):
+        """Deps locked to a commit SHA show 'unknown' when ref is a raw SHA."""
+        with self._chdir_tmp() as tmp:
+            mock_get_apm_dir.return_value = tmp
+            mock_get_path.return_value = tmp / "apm.lock.yaml"
+
+            deps = {
+                "org/commit-pkg": _locked_dep(
+                    "org/commit-pkg",
+                    resolved_ref="abc1234567890def1234567890abc1234567890de",
+                ),
+            }
+            mock_lf_cls.read.return_value = _make_lockfile(deps)
+
+            mock_downloader = MagicMock()
+            mock_dl_cls.return_value = mock_downloader
+            # No branch matches the 40-char hex ref name
+            mock_downloader.list_remote_refs.return_value = [
+                _remote_branch("main", sha="xyz999"),
+            ]
+
+            result = self.runner.invoke(cli, ["outdated"])
+
+            assert result.exit_code == 0
+            assert "unknown" in result.output.lower()
+            assert "org/commit-pkg" in result.output
+
+    # --- Local dep skipped ---
+
+    @patch(_PATCH_AUTH)
+    @patch(_PATCH_DOWNLOADER)
+    @patch(_PATCH_MIGRATE)
+    @patch(_PATCH_GET_LOCKFILE_PATH)
+    @patch(_PATCH_GET_APM_DIR)
+    @patch(_PATCH_LOCKFILE)
+    def test_local_dep_skipped(
+        self, mock_lf_cls, mock_get_apm_dir, mock_get_path,
+        mock_migrate, mock_dl_cls, mock_auth,
+    ):
+        """Local deps (source='local') should be skipped entirely."""
+        with self._chdir_tmp() as tmp:
+            mock_get_apm_dir.return_value = tmp
+            mock_get_path.return_value = tmp / "apm.lock.yaml"
+
+            deps = {
+                "./local/pkg": _locked_dep(
+                    "./local/pkg", resolved_ref="v1.0.0", source="local"
+                ),
+                "org/remote": _locked_dep("org/remote", resolved_ref="v1.0.0"),
+            }
+            mock_lf_cls.read.return_value = _make_lockfile(deps)
+
+            mock_downloader = MagicMock()
+            mock_dl_cls.return_value = mock_downloader
+            mock_downloader.list_remote_refs.return_value = [_remote_tag("v1.0.0")]
+
+            result = self.runner.invoke(cli, ["outdated"])
+
+            assert result.exit_code == 0
+            # Local dep should not appear in output
+            assert "./local/pkg" not in result.output
+            # Only one call for the remote dep
+            assert mock_downloader.list_remote_refs.call_count == 1
+
+    # --- Artifactory dep skipped ---
+
+    @patch(_PATCH_AUTH)
+    @patch(_PATCH_DOWNLOADER)
+    @patch(_PATCH_MIGRATE)
+    @patch(_PATCH_GET_LOCKFILE_PATH)
+    @patch(_PATCH_GET_APM_DIR)
+    @patch(_PATCH_LOCKFILE)
+    def test_artifactory_dep_skipped(
+        self, mock_lf_cls, mock_get_apm_dir, mock_get_path,
+        mock_migrate, mock_dl_cls, mock_auth,
+    ):
+        """Artifactory deps (registry_prefix set) should be skipped."""
+        with self._chdir_tmp() as tmp:
+            mock_get_apm_dir.return_value = tmp
+            mock_get_path.return_value = tmp / "apm.lock.yaml"
+
+            deps = {
+                "art/pkg": _locked_dep(
+                    "art/pkg",
+                    resolved_ref="v1.0.0",
+                    registry_prefix="artifactory/github",
+                ),
+                "org/remote": _locked_dep("org/remote", resolved_ref="v1.0.0"),
+            }
+            mock_lf_cls.read.return_value = _make_lockfile(deps)
+
+            mock_downloader = MagicMock()
+            mock_dl_cls.return_value = mock_downloader
+            mock_downloader.list_remote_refs.return_value = [_remote_tag("v1.0.0")]
+
+            result = self.runner.invoke(cli, ["outdated"])
+
+            assert result.exit_code == 0
+            assert "art/pkg" not in result.output
+            assert mock_downloader.list_remote_refs.call_count == 1
+
+    # --- Error fetching refs for one dep (graceful degradation) ---
+
+    @patch(_PATCH_AUTH)
+    @patch(_PATCH_DOWNLOADER)
+    @patch(_PATCH_MIGRATE)
+    @patch(_PATCH_GET_LOCKFILE_PATH)
+    @patch(_PATCH_GET_APM_DIR)
+    @patch(_PATCH_LOCKFILE)
+    def test_error_fetching_refs_shows_unknown(
+        self, mock_lf_cls, mock_get_apm_dir, mock_get_path,
+        mock_migrate, mock_dl_cls, mock_auth,
+    ):
+        """When list_remote_refs raises for one dep, show 'unknown' and continue."""
+        with self._chdir_tmp() as tmp:
+            mock_get_apm_dir.return_value = tmp
+            mock_get_path.return_value = tmp / "apm.lock.yaml"
+
+            deps = {
+                "org/fail-pkg": _locked_dep("org/fail-pkg", resolved_ref="v1.0.0"),
+                "org/ok-pkg": _locked_dep("org/ok-pkg", resolved_ref="v1.0.0"),
+            }
+            mock_lf_cls.read.return_value = _make_lockfile(deps)
+
+            mock_downloader = MagicMock()
+            mock_dl_cls.return_value = mock_downloader
+            mock_downloader.list_remote_refs.side_effect = [
+                RuntimeError("auth failed"),  # First dep errors
+                [_remote_tag("v2.0.0"), _remote_tag("v1.0.0")],  # Second succeeds
+            ]
+
+            result = self.runner.invoke(cli, ["outdated"])
+
+            assert result.exit_code == 0
+            assert "org/fail-pkg" in result.output
+            assert "unknown" in result.output.lower()
+            # Second dep should still be processed
+            assert "org/ok-pkg" in result.output
+            assert mock_downloader.list_remote_refs.call_count == 2
+
+    # --- No remote tags found ---
+
+    @patch(_PATCH_AUTH)
+    @patch(_PATCH_DOWNLOADER)
+    @patch(_PATCH_MIGRATE)
+    @patch(_PATCH_GET_LOCKFILE_PATH)
+    @patch(_PATCH_GET_APM_DIR)
+    @patch(_PATCH_LOCKFILE)
+    def test_no_remote_tags_shows_unknown(
+        self, mock_lf_cls, mock_get_apm_dir, mock_get_path,
+        mock_migrate, mock_dl_cls, mock_auth,
+    ):
+        """When no remote tags are found, status should be 'unknown'."""
+        with self._chdir_tmp() as tmp:
+            mock_get_apm_dir.return_value = tmp
+            mock_get_path.return_value = tmp / "apm.lock.yaml"
+
+            deps = {
+                "org/notags": _locked_dep("org/notags", resolved_ref="v1.0.0"),
+            }
+            mock_lf_cls.read.return_value = _make_lockfile(deps)
+
+            mock_downloader = MagicMock()
+            mock_dl_cls.return_value = mock_downloader
+            # Only branches returned, no tags
+            mock_downloader.list_remote_refs.return_value = [
+                _remote_branch("main"),
+                _remote_branch("develop"),
+            ]
+
+            result = self.runner.invoke(cli, ["outdated"])
+
+            assert result.exit_code == 0
+            assert "unknown" in result.output.lower()
+
+    # --- --global flag ---
+
+    @patch(_PATCH_AUTH)
+    @patch(_PATCH_DOWNLOADER)
+    @patch(_PATCH_MIGRATE)
+    @patch(_PATCH_GET_LOCKFILE_PATH)
+    @patch(_PATCH_GET_APM_DIR)
+    @patch(_PATCH_LOCKFILE)
+    def test_global_flag_uses_user_scope(
+        self, mock_lf_cls, mock_get_apm_dir, mock_get_path,
+        mock_migrate, mock_dl_cls, mock_auth,
+    ):
+        """--global should resolve scope to USER (~/.apm/)."""
+        with self._chdir_tmp() as tmp:
+            user_apm = tmp / ".apm"
+            user_apm.mkdir()
+            mock_get_apm_dir.return_value = user_apm
+            mock_get_path.return_value = user_apm / "apm.lock.yaml"
+            mock_lf_cls.read.return_value = _make_lockfile({})
+
+            result = self.runner.invoke(cli, ["outdated", "--global"])
+
+            assert result.exit_code == 0
+            # Verify get_apm_dir was called (scope is passed internally)
+            mock_get_apm_dir.assert_called_once()
+
+    # --- --verbose flag shows tags ---
+
+    @patch(_PATCH_AUTH)
+    @patch(_PATCH_DOWNLOADER)
+    @patch(_PATCH_MIGRATE)
+    @patch(_PATCH_GET_LOCKFILE_PATH)
+    @patch(_PATCH_GET_APM_DIR)
+    @patch(_PATCH_LOCKFILE)
+    def test_verbose_shows_tags(
+        self, mock_lf_cls, mock_get_apm_dir, mock_get_path,
+        mock_migrate, mock_dl_cls, mock_auth,
+    ):
+        """--verbose should include available tags for outdated deps."""
+        with self._chdir_tmp() as tmp:
+            mock_get_apm_dir.return_value = tmp
+            mock_get_path.return_value = tmp / "apm.lock.yaml"
+
+            deps = {
+                "org/pkg": _locked_dep("org/pkg", resolved_ref="v1.0.0"),
+            }
+            mock_lf_cls.read.return_value = _make_lockfile(deps)
+
+            mock_downloader = MagicMock()
+            mock_dl_cls.return_value = mock_downloader
+            mock_downloader.list_remote_refs.return_value = [
+                _remote_tag("v3.0.0"),
+                _remote_tag("v2.0.0"),
+                _remote_tag("v1.0.0"),
+            ]
+
+            result = self.runner.invoke(cli, ["outdated", "--verbose"])
+
+            assert result.exit_code == 0
+            # Should include tag listing in output
+            assert "v3.0.0" in result.output
+            assert "v2.0.0" in result.output
+
+    # --- Mixed scenario: local + remote up-to-date + remote outdated + error ---
+
+    @patch(_PATCH_AUTH)
+    @patch(_PATCH_DOWNLOADER)
+    @patch(_PATCH_MIGRATE)
+    @patch(_PATCH_GET_LOCKFILE_PATH)
+    @patch(_PATCH_GET_APM_DIR)
+    @patch(_PATCH_LOCKFILE)
+    def test_mixed_scenario(
+        self, mock_lf_cls, mock_get_apm_dir, mock_get_path,
+        mock_migrate, mock_dl_cls, mock_auth,
+    ):
+        """Mix of local, up-to-date, outdated, and error deps."""
+        with self._chdir_tmp() as tmp:
+            mock_get_apm_dir.return_value = tmp
+            mock_get_path.return_value = tmp / "apm.lock.yaml"
+
+            deps = {
+                "./my-local": _locked_dep("./my-local", source="local"),
+                "org/current": _locked_dep("org/current", resolved_ref="v3.0.0"),
+                "org/stale": _locked_dep("org/stale", resolved_ref="v1.0.0"),
+                "org/broken": _locked_dep("org/broken", resolved_ref="v1.0.0"),
+            }
+            mock_lf_cls.read.return_value = _make_lockfile(deps)
+
+            mock_downloader = MagicMock()
+            mock_dl_cls.return_value = mock_downloader
+            mock_downloader.list_remote_refs.side_effect = [
+                [_remote_tag("v3.0.0")],  # current: up-to-date
+                [_remote_tag("v2.0.0"), _remote_tag("v1.0.0")],  # stale: outdated
+                RuntimeError("network error"),  # broken: error
+            ]
+
+            result = self.runner.invoke(cli, ["outdated"])
+
+            assert result.exit_code == 0
+            # Local dep not in output
+            assert "./my-local" not in result.output
+            # Current dep in output
+            assert "org/current" in result.output
+            # Stale dep in output with outdated status
+            assert "org/stale" in result.output
+            # Broken dep in output with unknown
+            assert "org/broken" in result.output
+
+    # --- Dep with no resolved_ref (default branch comparison) ---
+
+    @patch(_PATCH_AUTH)
+    @patch(_PATCH_DOWNLOADER)
+    @patch(_PATCH_MIGRATE)
+    @patch(_PATCH_GET_LOCKFILE_PATH)
+    @patch(_PATCH_GET_APM_DIR)
+    @patch(_PATCH_LOCKFILE)
+    def test_no_resolved_ref_compares_against_default_branch(
+        self, mock_lf_cls, mock_get_apm_dir, mock_get_path,
+        mock_migrate, mock_dl_cls, mock_auth,
+    ):
+        """Dep with no resolved_ref compares SHA against default branch tip."""
+        with self._chdir_tmp() as tmp:
+            mock_get_apm_dir.return_value = tmp
+            mock_get_path.return_value = tmp / "apm.lock.yaml"
+
+            deps = {
+                "org/noref": _locked_dep(
+                    "org/noref", resolved_ref=None, resolved_commit="old_sha"
+                ),
+            }
+            mock_lf_cls.read.return_value = _make_lockfile(deps)
+
+            mock_downloader = MagicMock()
+            mock_dl_cls.return_value = mock_downloader
+            mock_downloader.list_remote_refs.return_value = [
+                _remote_branch("main", sha="new_sha_def456"),
+            ]
+
+            result = self.runner.invoke(cli, ["outdated"])
+
+            assert result.exit_code == 0
+            assert "outdated" in result.output.lower()
+            mock_downloader.list_remote_refs.assert_called_once()
+
+    @patch(_PATCH_AUTH)
+    @patch(_PATCH_DOWNLOADER)
+    @patch(_PATCH_MIGRATE)
+    @patch(_PATCH_GET_LOCKFILE_PATH)
+    @patch(_PATCH_GET_APM_DIR)
+    @patch(_PATCH_LOCKFILE)
+    def test_no_resolved_ref_no_branches_shows_unknown(
+        self, mock_lf_cls, mock_get_apm_dir, mock_get_path,
+        mock_migrate, mock_dl_cls, mock_auth,
+    ):
+        """Dep with no resolved_ref and no branches returns unknown."""
+        with self._chdir_tmp() as tmp:
+            mock_get_apm_dir.return_value = tmp
+            mock_get_path.return_value = tmp / "apm.lock.yaml"
+
+            deps = {
+                "org/noref": _locked_dep(
+                    "org/noref", resolved_ref=None, resolved_commit="old_sha"
+                ),
+            }
+            mock_lf_cls.read.return_value = _make_lockfile(deps)
+
+            mock_downloader = MagicMock()
+            mock_dl_cls.return_value = mock_downloader
+            mock_downloader.list_remote_refs.return_value = []
+
+            result = self.runner.invoke(cli, ["outdated"])
+
+            assert result.exit_code == 0
+            assert "unknown" in result.output.lower()
+
+    # --- No lockfile with --global ---
+
+    @patch(_PATCH_MIGRATE)
+    @patch(_PATCH_GET_LOCKFILE_PATH)
+    @patch(_PATCH_GET_APM_DIR)
+    @patch(_PATCH_LOCKFILE)
+    def test_no_lockfile_global_exits_1(
+        self, mock_lf_cls, mock_get_apm_dir, mock_get_path, mock_migrate,
+    ):
+        """--global with no lockfile exits 1 with user-scope hint."""
+        with self._chdir_tmp() as tmp:
+            mock_get_apm_dir.return_value = tmp
+            mock_get_path.return_value = tmp / "apm.lock.yaml"
+            mock_lf_cls.read.return_value = None
+
+            result = self.runner.invoke(cli, ["outdated", "--global"])
+
+            assert result.exit_code == 1
+            assert "~/.apm/" in result.output
+
+    # --- Virtual package deps ---
+
+    @patch(_PATCH_AUTH)
+    @patch(_PATCH_DOWNLOADER)
+    @patch(_PATCH_MIGRATE)
+    @patch(_PATCH_GET_LOCKFILE_PATH)
+    @patch(_PATCH_GET_APM_DIR)
+    @patch(_PATCH_LOCKFILE)
+    def test_virtual_dep_processed_normally(
+        self, mock_lf_cls, mock_get_apm_dir, mock_get_path,
+        mock_migrate, mock_dl_cls, mock_auth,
+    ):
+        """Virtual package deps are not skipped; their parent repo tags are fetched."""
+        with self._chdir_tmp() as tmp:
+            mock_get_apm_dir.return_value = tmp
+            mock_get_path.return_value = tmp / "apm.lock.yaml"
+
+            virtual_dep = LockedDependency(
+                repo_url="org/pkg",
+                resolved_ref="v1.0.0",
+                resolved_commit="abc",
+                is_virtual=True,
+                virtual_path="prompts/my.prompt.md",
+            )
+            # get_unique_key() for virtual deps returns "org/pkg/prompts/my.prompt.md"
+            deps = {"org/pkg/prompts/my.prompt.md": virtual_dep}
+            mock_lf_cls.read.return_value = _make_lockfile(deps)
+
+            mock_downloader = MagicMock()
+            mock_downloader.list_remote_refs.return_value = [
+                _remote_tag("v1.0.0"),
+            ]
+            mock_dl_cls.return_value = mock_downloader
+
+            result = self.runner.invoke(cli, ["outdated"])
+
+            assert result.exit_code == 0
+            # Remote refs were fetched -- virtual deps are NOT silently skipped
+            mock_downloader.list_remote_refs.assert_called_once()
+
+    # --- Dev dependency visibility ---
+
+    @patch(_PATCH_AUTH)
+    @patch(_PATCH_DOWNLOADER)
+    @patch(_PATCH_MIGRATE)
+    @patch(_PATCH_GET_LOCKFILE_PATH)
+    @patch(_PATCH_GET_APM_DIR)
+    @patch(_PATCH_LOCKFILE)
+    def test_dev_dep_included_in_output(
+        self, mock_lf_cls, mock_get_apm_dir, mock_get_path,
+        mock_migrate, mock_dl_cls, mock_auth,
+    ):
+        """Dev dependencies (is_dev=True) are included in the outdated check."""
+        with self._chdir_tmp() as tmp:
+            mock_get_apm_dir.return_value = tmp
+            mock_get_path.return_value = tmp / "apm.lock.yaml"
+
+            dev_dep = LockedDependency(
+                repo_url="org/devpkg",
+                resolved_ref="v1.0.0",
+                resolved_commit="abc",
+                is_dev=True,
+            )
+            deps = {"org/devpkg": dev_dep}
+            mock_lf_cls.read.return_value = _make_lockfile(deps)
+
+            mock_downloader = MagicMock()
+            mock_downloader.list_remote_refs.return_value = [
+                _remote_tag("v2.0.0"),
+                _remote_tag("v1.0.0"),
+            ]
+            mock_dl_cls.return_value = mock_downloader
+
+            result = self.runner.invoke(cli, ["outdated"])
+
+            assert result.exit_code == 0
+            # Dev dep must appear in the output (is_dev does not suppress it)
+            assert "org/devpkg" in result.output
+            # And its status should be outdated since v2.0.0 > v1.0.0
+            assert "outdated" in result.output.lower()
+
+    # --- Multiple packages with same version ---
+
+    @patch(_PATCH_AUTH)
+    @patch(_PATCH_DOWNLOADER)
+    @patch(_PATCH_MIGRATE)
+    @patch(_PATCH_GET_LOCKFILE_PATH)
+    @patch(_PATCH_GET_APM_DIR)
+    @patch(_PATCH_LOCKFILE)
+    def test_multiple_packages_same_version_all_shown(
+        self, mock_lf_cls, mock_get_apm_dir, mock_get_path,
+        mock_migrate, mock_dl_cls, mock_auth,
+    ):
+        """Two packages pinned to the same version both appear in output (no dedup)."""
+        with self._chdir_tmp() as tmp:
+            mock_get_apm_dir.return_value = tmp
+            mock_get_path.return_value = tmp / "apm.lock.yaml"
+
+            deps = {
+                "org/alpha": _locked_dep("org/alpha", resolved_ref="v2.0.0"),
+                "org/beta": _locked_dep("org/beta", resolved_ref="v2.0.0"),
+            }
+            mock_lf_cls.read.return_value = _make_lockfile(deps)
+
+            # Both are already at the latest v2.0.0
+            mock_downloader = MagicMock()
+            mock_downloader.list_remote_refs.return_value = [
+                _remote_tag("v2.0.0"),
+                _remote_tag("v1.0.0"),
+            ]
+            mock_dl_cls.return_value = mock_downloader
+
+            result = self.runner.invoke(cli, ["outdated"])
+
+            assert result.exit_code == 0
+            # Both packages must be checked (list_remote_refs called twice)
+            assert mock_downloader.list_remote_refs.call_count == 2
+
+    # --- Parallel checks ---
+
+    @patch(_PATCH_AUTH)
+    @patch(_PATCH_DOWNLOADER)
+    @patch(_PATCH_MIGRATE)
+    @patch(_PATCH_GET_LOCKFILE_PATH)
+    @patch(_PATCH_GET_APM_DIR)
+    @patch(_PATCH_LOCKFILE)
+    def test_parallel_checks_default(
+        self, mock_lf_cls, mock_get_apm_dir, mock_get_path,
+        mock_migrate, mock_dl_cls, mock_auth,
+    ):
+        """Default parallel-checks=4 should still check all deps."""
+        with self._chdir_tmp() as tmp:
+            mock_get_apm_dir.return_value = tmp
+            mock_get_path.return_value = tmp / "apm.lock.yaml"
+
+            deps = {
+                f"org/pkg{i}": _locked_dep(
+                    f"org/pkg{i}", resolved_ref=None,
+                    resolved_commit="aaa",
+                )
+                for i in range(6)
+            }
+            mock_lf_cls.read.return_value = _make_lockfile(deps)
+
+            mock_downloader = MagicMock()
+            mock_downloader.list_remote_refs.return_value = [
+                _remote_branch("main", sha="aaa"),
+            ]
+            mock_dl_cls.return_value = mock_downloader
+
+            result = self.runner.invoke(cli, ["outdated"])
+
+            assert result.exit_code == 0
+            assert "up-to-date" in result.output.lower()
+            assert mock_downloader.list_remote_refs.call_count == 6
+
+    @patch(_PATCH_AUTH)
+    @patch(_PATCH_DOWNLOADER)
+    @patch(_PATCH_MIGRATE)
+    @patch(_PATCH_GET_LOCKFILE_PATH)
+    @patch(_PATCH_GET_APM_DIR)
+    @patch(_PATCH_LOCKFILE)
+    def test_sequential_checks_flag(
+        self, mock_lf_cls, mock_get_apm_dir, mock_get_path,
+        mock_migrate, mock_dl_cls, mock_auth,
+    ):
+        """--parallel-checks 0 forces sequential checking."""
+        with self._chdir_tmp() as tmp:
+            mock_get_apm_dir.return_value = tmp
+            mock_get_path.return_value = tmp / "apm.lock.yaml"
+
+            deps = {
+                "org/alpha": _locked_dep(
+                    "org/alpha", resolved_ref=None,
+                    resolved_commit="aaa",
+                ),
+                "org/beta": _locked_dep(
+                    "org/beta", resolved_ref=None,
+                    resolved_commit="aaa",
+                ),
+            }
+            mock_lf_cls.read.return_value = _make_lockfile(deps)
+
+            mock_downloader = MagicMock()
+            mock_downloader.list_remote_refs.return_value = [
+                _remote_branch("main", sha="aaa"),
+            ]
+            mock_dl_cls.return_value = mock_downloader
+
+            result = self.runner.invoke(
+                cli, ["outdated", "--parallel-checks", "0"]
+            )
+
+            assert result.exit_code == 0
+            assert "up-to-date" in result.output.lower()
+            assert mock_downloader.list_remote_refs.call_count == 2
+
+    @patch(_PATCH_AUTH)
+    @patch(_PATCH_DOWNLOADER)
+    @patch(_PATCH_MIGRATE)
+    @patch(_PATCH_GET_LOCKFILE_PATH)
+    @patch(_PATCH_GET_APM_DIR)
+    @patch(_PATCH_LOCKFILE)
+    def test_parallel_checks_custom_value(
+        self, mock_lf_cls, mock_get_apm_dir, mock_get_path,
+        mock_migrate, mock_dl_cls, mock_auth,
+    ):
+        """--parallel-checks 2 uses at most 2 workers but checks all deps."""
+        with self._chdir_tmp() as tmp:
+            mock_get_apm_dir.return_value = tmp
+            mock_get_path.return_value = tmp / "apm.lock.yaml"
+
+            deps = {
+                f"org/pkg{i}": _locked_dep(
+                    f"org/pkg{i}", resolved_ref="v1.0.0",
+                    resolved_commit="aaa",
+                )
+                for i in range(4)
+            }
+            mock_lf_cls.read.return_value = _make_lockfile(deps)
+
+            mock_downloader = MagicMock()
+            mock_downloader.list_remote_refs.return_value = [
+                _remote_tag("v2.0.0"),
+                _remote_tag("v1.0.0"),
+            ]
+            mock_dl_cls.return_value = mock_downloader
+
+            result = self.runner.invoke(
+                cli, ["outdated", "-j", "2"]
+            )
+
+            assert result.exit_code == 0
+            assert mock_downloader.list_remote_refs.call_count == 4
+            assert "outdated" in result.output.lower()
+
+    @patch(_PATCH_AUTH)
+    @patch(_PATCH_DOWNLOADER)
+    @patch(_PATCH_MIGRATE)
+    @patch(_PATCH_GET_LOCKFILE_PATH)
+    @patch(_PATCH_GET_APM_DIR)
+    @patch(_PATCH_LOCKFILE)
+    def test_parallel_check_exception_handled(
+        self, mock_lf_cls, mock_get_apm_dir, mock_get_path,
+        mock_migrate, mock_dl_cls, mock_auth,
+    ):
+        """A failing remote check in parallel mode should not crash."""
+        with self._chdir_tmp() as tmp:
+            mock_get_apm_dir.return_value = tmp
+            mock_get_path.return_value = tmp / "apm.lock.yaml"
+
+            deps = {
+                "org/good": _locked_dep(
+                    "org/good", resolved_ref="v1.0.0",
+                    resolved_commit="aaa",
+                ),
+                "org/bad": _locked_dep(
+                    "org/bad", resolved_ref="v1.0.0",
+                    resolved_commit="bbb",
+                ),
+            }
+            mock_lf_cls.read.return_value = _make_lockfile(deps)
+
+            mock_downloader =
MagicMock() + + def _side_effect(dep_ref): + if "bad" in (dep_ref.repo_url or ""): + raise ConnectionError("network down") + return [_remote_tag("v1.0.0")] + + mock_downloader.list_remote_refs.side_effect = _side_effect + mock_dl_cls.return_value = mock_downloader + + result = self.runner.invoke( + cli, ["outdated", "-j", "4"] + ) + + # Should not crash -- bad dep becomes "unknown" + assert result.exit_code == 0 + assert "unknown" in result.output.lower() + + # --- ADO dependency handling --- + + @patch(_PATCH_AUTH) + @patch(_PATCH_DOWNLOADER) + @patch(_PATCH_MIGRATE) + @patch(_PATCH_GET_LOCKFILE_PATH) + @patch(_PATCH_GET_APM_DIR) + @patch(_PATCH_LOCKFILE) + def test_ado_dep_builds_correct_reference( + self, mock_lf_cls, mock_get_apm_dir, mock_get_path, + mock_migrate, mock_dl_cls, mock_auth, + ): + """ADO deps (host=dev.azure.com) should pass full URL to DependencyReference.parse().""" + with self._chdir_tmp() as tmp: + mock_get_apm_dir.return_value = tmp + mock_get_path.return_value = tmp / "apm.lock.yaml" + + ado_dep = LockedDependency( + repo_url="myorg/myproject/_git/myrepo", + host="dev.azure.com", + resolved_ref="v1.0.0", + resolved_commit="aaa", + ) + deps = {"myorg/myproject/_git/myrepo": ado_dep} + mock_lf_cls.read.return_value = _make_lockfile(deps) + + mock_downloader = MagicMock() + mock_downloader.list_remote_refs.return_value = [ + _remote_tag("v1.0.0"), + ] + mock_dl_cls.return_value = mock_downloader + + result = self.runner.invoke(cli, ["outdated"]) + + assert result.exit_code == 0 + # Verify list_remote_refs was called (dep was not silently skipped) + mock_downloader.list_remote_refs.assert_called_once()