rules_uv_bare

Bazel rules for Python projects using uv as the package manager. Keep your Python packaging in the Python ecosystem (pyproject.toml + uv) and bridge it into Bazel with minimal effort. The rules also map naturally to uv's workspace feature, giving fast, incremental installs.

This approach trades away some of Bazel’s core strengths — hermetic/sandboxed builds, remote execution, and fine-grained deps tracking — but in return allows the project to integrate seamlessly with the standard Python ecosystem.

Why rules_uv_bare?

Suppose that you have a uv workspace with two packages:

├── pyproject.toml  # uv workspace root
├── my_package_a
│   ├── pyproject.toml
│   └── ...
├── my_package_b
│   ├── pyproject.toml
│   └── ...
...

With the standard rules_python approach, you need to write a BUILD.bazel file for every package to describe its build metadata: srcs, data, deps, and so on. In particular, dependency information ends up duplicated in two places (pyproject.toml and BUILD files), and keeping them in sync is left to you.

There is also no built-in counterpart to uv's workspace concept; third-party packages are pinned in MODULE.bazel and every target must explicitly declare the dependencies it needs.

Here is what a typical rules_python setup looks like:

# my_package_a/BUILD.bazel
py_library(
    name = "my_package_a",
    imports = ["."],
    srcs = glob(["my_package_a/**/*.py"]),
    deps = [
        "@pip//pydantic",
    ],
)

# my_package_b/BUILD.bazel
py_library(
    name = "my_package_b",
    imports = ["."],
    srcs = glob(["my_package_b/**/*.py"]),
    deps = [
        "@pip//numpy",
        "//my_package_a",
    ],
)

# MODULE.bazel
pip = use_extension("@rules_python//python/extensions:pip.bzl", "pip")
pip.parse(
    hub_name = "pip",
    python_version = "3.12",
    requirements_lock = "//:requirements.txt",
)
use_repo(pip, "pip")

With rules_uv_bare, the BUILD files shrink to bare declarations:

# my_package_a/BUILD.bazel
uv_py_package(name = "my_package_a")

# my_package_b/BUILD.bazel
uv_py_package(name = "my_package_b")

# BUILD.bazel
uv_py_workspace(
    name = "workspace",
    members = ["//my_package_a", "//my_package_b"],
    lock = "uv.lock",
)

Most dependency metadata stays in pyproject.toml, which serves as the single source of truth already understood by most of the Python ecosystem. You can still declare explicit deps when you need Bazel features such as bazel cquery.

Features:

  • Seamless Python ecosystem integration: Most Python metadata such as dependencies stays in the standard pyproject.toml format. No need to mirror them into Bazel targets or maintain a separate lock-file translation layer.
  • Multi-package workspaces: Maps naturally to uv's workspace concept. Multiple Python packages share a single lock file and virtualenv, with inter-package dependencies declared via standard [tool.uv.sources]. You can also define more than one workspace with different third-party dependencies in the same Bazel module, and packages can belong to multiple workspaces.

Quick Start

1. Add the module dependency

In your MODULE.bazel:

bazel_dep(name = "rules_uv_bare", version = "0.0.1")

2. Declare a Python package

Each Python package has its own pyproject.toml and BUILD.bazel:

# my_package/pyproject.toml
[project]
name = "my-package"
version = "0.0.1"
requires-python = ">=3.12"
dependencies = ["numpy>=1.26,<3"]

# my_package/BUILD.bazel
load("@rules_uv_bare//uv:defs.bzl", "uv_py_package")

uv_py_package(name = "my_package")

3. Create a workspace

A workspace groups packages together and manages their shared virtualenv:

# BUILD.bazel
load("@rules_uv_bare//uv:defs.bzl", "uv_py_entrypoint", "uv_py_lock", "uv_py_test", "uv_py_workspace")

uv_py_workspace(
    name = "my_workspace",
    members = ["//my_package"],
    lock = "uv.lock",
)

uv_py_lock(
    name = "my_workspace.lock",
    workspace = ":my_workspace",
)

uv_py_entrypoint(
    name = "run",
    workspace = ":my_workspace",
    cmd = ["python", "-m", "my_package"],
)

uv_py_test(
    name = "test",
    workspace = ":my_workspace",
    cmd = ["python", "-m", "pytest", "tests/"],
)

4. Generate the lock file

bazel run //:my_workspace.lock

5. Build and test

bazel build //:run
bazel test //:test

Rules Reference

See docs/rules.md for the full API reference (generated by Stardoc).

Examples

Multi-Package Workspaces

Packages within a workspace can depend on each other using uv's workspace sources in their pyproject.toml:

# pkg_b/pyproject.toml
[project]
name = "pkg-b"
dependencies = ["pkg-a"]

[tool.uv.sources]
pkg-a = { workspace = true }

# pkg_a/BUILD.bazel
uv_py_package(name = "pkg_a")

# pkg_b/BUILD.bazel
uv_py_package(name = "pkg_b")

# BUILD.bazel
uv_py_workspace(
    name = "ws",
    # pkg_b depends on pkg_a; both are in the same workspace so uv resolves it automatically.
    members = ["//pkg_a", "//pkg_b"],
    lock = "uv.lock",
)

Optionally, you can declare explicit deps in BUILD files to expose the dependency graph to Bazel.

# pkg_a/BUILD.bazel
uv_py_package(name = "pkg_a")

# pkg_b/BUILD.bazel
# Declare that pkg_b depends on pkg_a
uv_py_package(name = "pkg_b", deps = ["//pkg_a"])

# BUILD.bazel
uv_py_workspace(
    name = "ws",
    members = ["//pkg_b"],   # pkg_a is also included automatically
    lock = "uv.lock",
)

rules_python Integration

Build a .whl from your target:

load("@rules_python//python:packaging.bzl", "py_wheel")

py_wheel(
    name = "my_ext_wheel",
    package = ":my_ext_lib",
)

Import the wheel:

uv_py_import_wheel(
    name = "my_ext",
    src = ":my_ext_wheel",
)

Pass the wheel to a package (via wheel_deps) or directly to the workspace (via wheels):

# Option A: via uv_py_package (collected transitively)
uv_py_package(
    name = "my_package",
    wheel_deps = [":my_ext"],
)
uv_py_workspace(
    name = "ws",
    members = [":my_package"],
    lock = "uv.lock",
)

# Option B: directly on workspace
uv_py_workspace(
    name = "ws",
    members = [":my_package"],
    wheels = [":my_ext"],
    lock = "uv.lock",
)

See examples/import_from_rules_python/ for more details.

Native build / cross-platform deployment

C++ extensions (e.g. via nanobind or pybind11) can be integrated by building a wheel and importing it into the workspace.

When target_platforms is passed to uv_py_workspace, it creates a single unified uv.lock covering all listed platforms (i.e., the same way as uv). Internally, it uses Bazel split transitions to build each wheel once per target platform, and writes marker-qualified [tool.uv.sources] entries (e.g. platform_machine == 'x86_64') so that uv picks the correct wheel for each platform from one lock file.
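As a rough sketch of the setup described above (the target_platforms attribute comes from this section, but its value format, the platform labels, and the package names are illustrative assumptions, not the rule's confirmed API):

```starlark
# BUILD.bazel -- cross-platform lock sketch; platform labels are hypothetical
uv_py_workspace(
    name = "ws",
    members = ["//my_native_pkg"],
    wheels = [":my_ext"],  # wheel built once per listed platform via split transitions
    lock = "uv.lock",
    target_platforms = [
        "//platforms:linux_x86_64",
        "//platforms:macos_arm64",
    ],
)
```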

Because uv sync runs locally, building the workspace target and its sub-targets (.run, .activate) only works when the host platform matches the target platform. Other rules, such as uv_py_lock, uv_py_export, and uv_py_deploy, work on any platform.

See examples/native/ and examples/native_cross/ for full working examples.

Advanced Examples

Private PyPI registries

uv_py_workspace(
    name = "ws",
    members = ["//my_package"],
    lock = "uv.lock",
    uv_sync_args = [
        "--index-url", "https://private.pypi.org/simple",
        "--extra-index-url", "https://pypi.org/simple",
    ],
)
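Credentials for a private index are usually better supplied out of band than hard-coded in BUILD files. uv reads standard environment variables such as UV_INDEX_URL and UV_EXTRA_INDEX_URL; whether rules_uv_bare forwards them to its uv invocations is an assumption here, and the URL and token below are placeholders:

```shell
# Supply index credentials via uv's environment variables instead of
# embedding them in BUILD files (placeholder URL and token).
export UV_EXTRA_INDEX_URL="https://user:token@private.pypi.org/simple"
```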

Custom tool configuration

uv_py_workspace(
    name = "ws",
    members = ["//my_package"],
    lock = "uv.lock",
    extra_pyproject_content = """
[tool.pytest.ini_options]
testpaths = ["tests"]
addopts = "-v"
""",
)

Custom dependency groups

uv_py_workspace(
    name = "ws",
    members = ["//my_package"],
    lock = "uv.lock",
    dependency_groups = {
        "test": ["pytest>=8.0", "pytest-cov"],
        "lint": ["ruff>=0.4"],
    },
)
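For reference, groups like these correspond to the standard [dependency-groups] table (PEP 735) that uv understands; exactly how rules_uv_bare materializes them in the generated workspace pyproject.toml is an assumption here:

```toml
# PEP 735 equivalent of the dependency_groups attribute above (sketch)
[dependency-groups]
test = ["pytest>=8.0", "pytest-cov"]
lint = ["ruff>=0.4"]
```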
