keepdim not working as expected in AugmentationSequential #2848
Thanks for the report, I think it's indeed a bug. I hit it here as well: #2800 (comment) (cc @shijianjian). A workaround is manually setting `keepdim` after initializing the container, but I couldn't find at the time what was causing it, nor why the #2800 patch caused it:
aug_dict.keepdim = True
Basically, `keepdim` is not being initialized and/or propagated correctly after the container is initialized. This causes the container to skip rebuilding the correct dimensions before returning (by default `keepdim` is initialized as `False`).
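The failure mode described above can be modeled without kornia: a container that defaults `keepdim` to `False` and never propagates the value passed at init time will unsqueeze a CHW input to BCHW and skip restoring the original shape. A minimal, hypothetical sketch (the class and its internals are illustrative, not kornia's actual implementation):

```python
class BuggyContainer:
    """Toy model of a container whose `keepdim` argument is not propagated."""

    def __init__(self, keepdim=False):
        # Bug model: the constructor argument is accepted but never stored,
        # so the attribute always falls back to the default of False.
        self.keepdim = False

    def __call__(self, shape):
        # `shape` stands in for a CHW tensor; a batch dim is added internally.
        batched = (1,) + tuple(shape)
        if self.keepdim:
            # Restore the original unbatched shape before returning.
            return batched[1:]
        # keepdim is False, so the added batch dim leaks to the caller.
        return batched


# The flag passed at init is silently lost:
print(BuggyContainer(keepdim=True)((3, 32, 32)))  # (1, 3, 32, 32)

# Workaround from the comment: set the attribute manually after init.
aug = BuggyContainer(keepdim=True)
aug.keepdim = True
print(aug((3, 32, 32)))  # (3, 32, 32)
```

Setting the attribute after construction bypasses the broken propagation, which matches the `aug_dict.keepdim = True` workaround quoted above.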
Describe the bug
`keepdim` is supposed to retain the dimensions of the input tensor. However, passing a CHW tensor into an augmentation here results in a batch dim being added.
Reproduction steps
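The original reproduction snippet did not survive the page scrape. A plausible minimal reproduction, assuming kornia 0.7.2's `AugmentationSequential` API (the specific augmentation, `RandomHorizontalFlip`, is an illustrative choice):

```python
import torch
import kornia.augmentation as K

aug = K.AugmentationSequential(
    K.RandomHorizontalFlip(p=1.0),
    keepdim=True,  # should preserve the unbatched CHW shape
)

x = torch.rand(3, 8, 8)  # CHW input, no batch dim
out = aug(x)
print(out.shape)
# Expected with keepdim=True: torch.Size([3, 8, 8])
# Observed on kornia 0.7.2: a batch dim is added, so the assertion fails
assert out.shape == x.shape
```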
Expected behavior
The snippet above shouldn't raise an assertion error: with `keepdim=True`, no batch dim should be added to the CHW input.
Environment
wget https://raw.githubusercontent.com/pytorch/pytorch/main/torch/utils/collect_env.py
# For security purposes, please check the contents of collect_env.py before running it.
python collect_env.py
kornia==0.7.2
How you installed kornia (conda, pip, source): pip (pip install -e .)