
Conversation


@wli51 wli51 commented Jan 5, 2026

This PR adds back the Generative Adversarial Network (GAN) training functionality following its removal in refactor #19, where the model forward-pass logic was decoupled from the trainer and promoted to a separate engine subpackage (#19 only included the engine parts for regular UNet training).

Introduces

  1. src/virtual_stain_flow/engine/forward_groups.py:
     Adds DiscriminatorForwardGroup, a standardized interface for discriminator forward passes.

  2. src/virtual_stain_flow/engine/orchestrators.py:
     An additional layer of abstraction, OrchestratedStep and GANOrchestrator, between the forward-pass engine and the trainer classes, for defining complex orchestrations that involve more than one model (neural network), as needed when training GANs. Specifically, GANOrchestrator provides a training interface to train the generator and discriminator separately. The complexity comes from two facts:

  • when training only the generator, the data still has to flow through the discriminator, and vice versa;
  • the generator needs to be updated less frequently than the discriminator to ensure stable training.

     The orchestrator abstraction keeps this logic clean.

  3. New operations for the Context class.

  4. Minimal tests.
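To make the alternating-update scheduling concrete, here is a minimal sketch of the pattern described above. The class and method names (GANOrchestrator, step, gen_every) are illustrative only and do not reflect the actual virtual_stain_flow API; the real orchestrator also handles forward groups, losses, and optimizers, which are omitted here.

```python
class GANOrchestrator:
    """Illustrative sketch: alternate discriminator and generator updates.

    The discriminator is updated on every step; the generator only every
    `gen_every` steps, mirroring the common n-critic schedule used to
    stabilize adversarial (e.g. WGAN) training.
    """

    def __init__(self, gen_every=5):
        self.gen_every = gen_every
        self._step = 0
        self.log = []  # records which sub-models were updated on each step

    def step(self, batch):
        updated = ["discriminator"]  # data always flows through D
        if self._step % self.gen_every == 0:
            # A generator update still requires a discriminator forward
            # pass to compute the adversarial loss, but only the
            # generator's weights would change here.
            updated.append("generator")
        self._step += 1
        self.log.append(updated)
        return updated


orch = GANOrchestrator(gen_every=3)
for _ in range(6):
    orch.step(batch=None)
# Steps 0 and 3 update both models; the rest update only the discriminator.
```

The design point is that the trainer only ever calls a single step method; which networks receive gradient updates on a given iteration is entirely the orchestrator's concern.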

@wli51 wli51 marked this pull request as ready for review January 6, 2026 21:50
Member


I double-checked and all the tests pass, nice job. Given the great testing work here, you might consider running the tests as an automated GitHub Actions job. The benefit is that it would give you and reviewers confidence about the changes within the context of the pull request (similar to how the pre-commit checks operate).
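A minimal workflow along those lines might look like the following (the file would live at .github/workflows/tests.yml; the install command and pytest assume a standard Python project layout with test dependencies in a dev extra, which may differ here):

```yaml
name: tests

on:
  pull_request:
  push:
    branches: [main]

jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.11"
      # assumes test dependencies are declared as a "dev" extra
      - run: pip install -e ".[dev]"
      - run: pytest
```

With this in place, the test results appear as a required check on each pull request alongside the existing pre-commit checks.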

wli51 added 18 commits January 7, 2026 11:07
…ndling and streamline output dimension calculations
…to support a variety of forward function signatures in concrete loss classes, the old abstract AbstractLoss class has been replaced with a protocol and a non-abstract BaseLoss. Related modules and methods using the old abstract losses are also updated. A new module is implemented to house brushed-up wGAN training losses, and the obsolete modules were removed.
… for inputs and outputs, enhancing type clarity and consistency.
…l instead of BaseGeneratorModel for consistency.
…flowLogger for optional dictionary parameters
…weights and add model registry for logging parameters