🚀 The feature

Problem
The wrap() function currently supports only the built-in TVTensor types from torchvision.
If I define a custom TVTensor, I cannot use wrap() to restore its metadata in the same way.
Proposal
Allow users to register custom wrapping logic, for example using functools.singledispatch.
Alternatively, add a method or API like CustomTensor.wrap(tensor, like=obj) so each custom type can define its own wrapping.
Example
```python
import torch
from torchvision import tv_tensors


class CustomTensor(tv_tensors.TVTensor):
    extra_info: str

    @classmethod
    def _wrap(cls, tensor, *, extra_info):
        out = tensor.as_subclass(cls)
        out.extra_info = extra_info
        return out

    def __new__(cls, data, *, extra_info):
        # Construct via _wrap so extra_info is attached
        return cls._wrap(torch.as_tensor(data), extra_info=extra_info)


# Current wrap() does not handle extra_info
custom = CustomTensor(torch.ones(2, 2), extra_info="abc")
result = custom + 1
wrapped = tv_tensors.wrap(result, like=custom)  # metadata is lost


# 1st proposal (dispatch; assumes wrap becomes a singledispatch function)
@wrap.register(CustomTensor)
def _(tensor, *, like, **kwargs):
    return CustomTensor._wrap(tensor, extra_info=like.extra_info)


# 2nd proposal (wrap method on the type)
def wrap(wrappee, *, like, **kwargs):
    return like.wrap(wrappee, like=like)  # i.e., like.__wrap__(...)
```
Motivation, pitch
I want to create my own TVTensor types for domain-specific tasks (e.g., depth maps, segmentation, classification, 3D point clouds).
Ideally, I want to use the same wrap/unwrap workflow as built-in TVTensors so transforms and kernels remain easy and clean.
Supporting custom TVTensors in wrap() helps keep the code extensible but simple. No breaking changes.
Alternatives

- Manually re-implementing wrapping logic in every custom transform (hard to maintain)
- Monkeypatching wrap() or patching transforms
- Not supporting custom TVTensor use-cases at all
Additional context
This proposal enables advanced users to extend TVTensors for new applications, following existing patterns like register_kernel.
The implementation can be very minimal at first, using the existing Python singledispatch pattern.
If maintainers accept, I'm happy to draft a small PR or demo.
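As a rough illustration of how small the dispatch machinery could be, here is a self-contained sketch. One caveat: functools.singledispatch dispatches on the first positional argument, while wrap() needs to dispatch on the keyword-only like argument, so this sketch uses an explicit registry instead (closer in spirit to register_kernel). All names here (register_wrap, _WRAP_REGISTRY, FakeTensor) are hypothetical stand-ins, not torchvision API:

```python
# Hypothetical sketch only: register_wrap, _WRAP_REGISTRY, and FakeTensor
# are illustrative names, not part of torchvision's API.

_WRAP_REGISTRY = {}


def register_wrap(cls):
    """Register custom wrapping logic for a given TVTensor subclass."""
    def decorator(fn):
        _WRAP_REGISTRY[cls] = fn
        return fn
    return decorator


def wrap(wrappee, *, like, **kwargs):
    fn = _WRAP_REGISTRY.get(type(like))
    if fn is not None:
        return fn(wrappee, like=like, **kwargs)
    # Placeholder for the current built-in behavior (type restoration).
    return wrappee


# Stand-ins for torch.Tensor and a custom TVTensor subclass:
class FakeTensor:
    def __init__(self, value):
        self.value = value


class CustomTensor(FakeTensor):
    def __init__(self, value, *, extra_info):
        super().__init__(value)
        self.extra_info = extra_info


@register_wrap(CustomTensor)
def _(wrappee, *, like, **kwargs):
    return CustomTensor(wrappee.value, extra_info=like.extra_info)


custom = CustomTensor(1.0, extra_info="abc")
result = FakeTensor(2.0)              # pretend this came from `custom + 1`
restored = wrap(result, like=custom)
print(restored.extra_info)            # -> "abc": metadata survives
```

Unregistered types fall through to the default branch, so existing behavior for built-in TVTensors would be unchanged.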