Changing input size #1

@sulaimanvesal

Description

Hi,

I was able to reproduce the results on the BraTS 2018 data. However, when training the model on another dataset with a different input size (20x256x256 or 32x256x256), I get an error in this part:

# Decoder
d_base_feat_size = [4, 5, 6]

# PatchExpand output
print(x.shape)           # torch.Size([1, 544, 1280])
print(D, H, W, B, L, C)  # 8 10 12 1 544 1280

x = x.view(B, D, H, W, C)
# RuntimeError: shape '[1, 8, 10, 12, 1280]' is invalid for input of size 696320
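For reference, the mismatch is easy to verify by hand: view() needs the tensor's element count to match the requested shape, but here the decoder expects D*H*W = 960 tokens while the tensor only carries L = 544. A minimal reproduction (plain PyTorch, values copied from the error above):

```python
import torch

B, C = 1, 1280
D, H, W = 8, 10, 12   # spatial grid the decoder expects at this stage
L = 544               # token count the tensor actually carries

x = torch.zeros(B, L, C)
print(x.numel())      # 696320, matching the error message
print(D * H * W)      # 960 expected tokens, but L == 544

try:
    x.view(B, D, H, W, C)
except RuntimeError as e:
    print(e)  # shape '[1, 8, 10, 12, 1280]' is invalid for input of size 696320
```

So the fix has to make L and D*H*W agree, which comes back to choosing d_base_feat_size consistently with the input volume.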

How do you compute d_base_feat_size for an input size of 20x256x256 or 32x256x256?
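For what it's worth, my current guess is that d_base_feat_size is the bottleneck grid: each input axis divided by the patch size and then halved once per downsampling stage. With the assumed defaults of patch size 4 and 3 downsampling stages (an overall factor of 32, which would match [4, 5, 6] for a 128x160x192 crop), a hypothetical helper would look like this:

```python
def d_base_feat_size(input_size, patch_size=4, num_downsamples=3):
    """Hypothetical helper: bottleneck feature grid per axis.

    Assumes each axis is divided by patch_size at patch embedding
    and halved num_downsamples times in the encoder (factor 32 here).
    """
    factor = patch_size * 2 ** num_downsamples
    for dim in input_size:
        if dim % factor:
            raise ValueError(f"axis {dim} is not divisible by {factor}; pad or crop first")
    return [dim // factor for dim in input_size]

print(d_base_feat_size((128, 160, 192)))  # [4, 5, 6]
print(d_base_feat_size((32, 256, 256)))   # [1, 8, 8]
# 20x256x256 raises: 20 is not divisible by 32, so the depth axis
# would need padding (or a different patch/downsampling config).
```

If that assumption is right, 32x256x256 should work with d_base_feat_size = [1, 8, 8], while 20x256x256 cannot be used without padding the depth axis.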
