TinyExp

Simple experiment management for PyTorch.

TinyExp is built around one idea: your configured experiment is your entrypoint.

TinyExp demo

Instead of splitting config, launcher, and execution across many files, TinyExp keeps them together in one experiment definition so iteration stays fast and predictable.

What you get in practice:

  • Experiment-centered configuration (Hydra/OmegaConf)
  • CLI overrides without rewriting code
  • A training loop that stays close to plain PyTorch
  • One experiment definition that runs everywhere, from local debugging to distributed launch

Why TinyExp

TinyExp focuses on simple, maintainable experiment management:

  • Your experiment code stays readable.
  • Your config stays structured and easy to override.
  • Your execution path stays consistent as experiments grow.

Design Philosophy

TinyExp is intentionally light.

It is not trying to be a heavy trainer framework that owns your epoch loop, callback system, or full runtime lifecycle. Instead, it focuses on a smaller goal:

  • keep the experiment itself as the main entrypoint
  • keep the training loop in user space
  • make configuration and launch behavior explicit
  • expose shared capabilities through focused XXXCfg components
  • provide thin helpers rather than framework-owned control flow
  • treat examples as reusable recipes, not just demos

In short, TinyExp should help you write less experiment plumbing, not less experiment logic.

For a longer explanation, see docs/philosophy.md.

Quick Start (1 Minute)

Option A: Install with pip and use the import-based entrypoint

Install the package:

```
pip install tinyexp
```

Create `your_exp.py`:

```python
from tinyexp import store_and_run_exp
from tinyexp.examples.mnist_exp import Exp

store_and_run_exp(Exp)
```

Run it, optionally overriding config from the CLI:

```
python your_exp.py
python your_exp.py dataloader_cfg.train_batch_size_per_device=16
```

Option B: Run the bundled example from source (for development)

```
git clone https://github.com/HKUST-SAIL/tinyexp.git
cd tinyexp
make install
uv run python tinyexp/examples/mnist_exp.py
```

Common Commands

Run MNIST with a config override:

```
uv run python tinyexp/examples/mnist_exp.py dataloader_cfg.train_batch_size_per_device=16
```

Print all available configs:

```
uv run python tinyexp/examples/mnist_exp.py mode=help
```

Print all configs plus your overrides:

```
uv run python tinyexp/examples/mnist_exp.py mode=help dataloader_cfg.train_batch_size_per_device=16
```
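TinyExp hands these dotted overrides to Hydra/OmegaConf, so each `a.b=value` argument addresses a key in the nested config. As a rough illustration of the mechanics (not TinyExp's actual code), a dotted override is just a path into a nested config plus a new value:

```python
def apply_override(cfg: dict, override: str) -> dict:
    """Apply one "a.b.c=value" style override to a nested dict config."""
    dotted, _, raw = override.partition("=")
    keys = dotted.split(".")
    node = cfg
    for key in keys[:-1]:       # walk down to the parent of the target key
        node = node.setdefault(key, {})
    try:
        value = int(raw)        # CLI values arrive as strings; coerce ints
    except ValueError:
        value = raw
    node[keys[-1]] = value
    return cfg

cfg = {"dataloader_cfg": {"train_batch_size_per_device": 32}}
apply_override(cfg, "dataloader_cfg.train_batch_size_per_device=16")
print(cfg["dataloader_cfg"]["train_batch_size_per_device"])  # 16
```

The real override grammar (type coercion, list values, adding keys) is richer; see the Hydra/OmegaConf documentation for the full rules.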

Example Experiments

For the ImageNet example, point IMAGENET_HOME at your dataset and run:

```
export IMAGENET_HOME=/path/to/imagenet
uv run python tinyexp/examples/resnet_exp.py
```

How It Works

  1. Define an experiment class that inherits from TinyExp.
  2. Keep model/data/optimizer/scheduler config in nested dataclasses.
  3. Implement run() (and any train/eval helpers) in the same experiment definition.
  4. Launch the script and override config from the CLI when needed.

This gives you a single, explicit place to manage experiment behavior.
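The steps above can be sketched as follows. `TinyExp` and `store_and_run_exp` are the library's names (the latter appears in Quick Start); the stub base class and the specific config fields here are illustrative assumptions, not the real API:

```python
from dataclasses import dataclass, field

class TinyExp:  # stand-in for the real base class from `tinyexp`
    pass

@dataclass
class OptimizerCfg:        # illustrative nested config dataclass
    lr: float = 1e-3

@dataclass
class DataloaderCfg:       # illustrative; mirrors the override used above
    train_batch_size_per_device: int = 32

@dataclass
class Exp(TinyExp):
    optimizer_cfg: OptimizerCfg = field(default_factory=OptimizerCfg)
    dataloader_cfg: DataloaderCfg = field(default_factory=DataloaderCfg)

    def run(self) -> str:
        # The training loop lives here, in user space, close to plain PyTorch.
        return (f"train bs={self.dataloader_cfg.train_batch_size_per_device}, "
                f"lr={self.optimizer_cfg.lr}")

print(Exp().run())  # train bs=32, lr=0.001
```

In the real thing, the final lines would instead be `store_and_run_exp(Exp)`, which registers the config and dispatches to `run()` with any CLI overrides applied.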

Development

Install environment and hooks:

```
make install
```

Run checks:

```
make check
```

Run tests:

```
make test
```

Build docs:

```
make docs-test
```

Build package:

```
make build
```

Release:

```
make release VERSION=0.0.4
```

Documentation

Contributing

PRs and issues are welcome. See CONTRIBUTING.md.

License

MIT License. See LICENSE.
