
keepdim not working as expected in AugmentationSequential #2848

Open
ashnair1 opened this issue Mar 20, 2024 · 1 comment
Labels
bug 🐛 · help wanted · module: augmentations

Comments

@ashnair1
Contributor

ashnair1 commented Mar 20, 2024

Describe the bug

keepdim=True is supposed to make the output retain the dimensions of the input tensor. However, passing a CHW tensor through AugmentationSequential here results in a batch dimension being added to the output.
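
For contrast, a standalone augmentation honours the flag. A minimal sketch, assuming kornia 0.7.x behavior where individual augmentations accept keepdim directly:

import torch
import kornia.augmentation as K

img = torch.randn(3, 256, 256)
flip = K.RandomHorizontalFlip(p=1.0, keepdim=True)
out = flip(img)
assert out.shape == img.shape  # CHW preserved, no batch dim added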

Reproduction steps

import torch
import kornia.augmentation as K

img = torch.randn(3, 256, 256)
masks = [torch.ones(3, 256, 256), torch.ones(2, 256, 256)]

aug_dict = K.AugmentationSequential(
    K.RandomHorizontalFlip(p=1.0),
    data_keys=None,
    keepdim=True,
)

data = {'image': img, 'mask': masks}
out = aug_dict(data)


if img.shape != out["image"].shape:
    raise AssertionError(f"Expected {img.shape}, got {out['image'].shape}")

Expected behavior

The above snippet shouldn't raise an AssertionError: with keepdim=True, the container shouldn't add a batch dimension, and out["image"] should keep the original CHW shape.

Environment

wget https://raw.githubusercontent.com/pytorch/pytorch/main/torch/utils/collect_env.py
# For security purposes, please check the contents of collect_env.py before running it.
python collect_env.py

kornia==0.7.2
  • PyTorch Version: 2.2.0
  • OS: Linux
  • How you installed PyTorch: pip
  • Build command used (if compiling from source): pip install -e .
  • Python version: 3.11
  • CUDA/cuDNN version:
  • GPU models and configuration:
  • Any other relevant information:
@ashnair1 ashnair1 added the help wanted Extra attention is needed label Mar 20, 2024
@johnnv1 johnnv1 added the bug 🐛 Something isn't working label Mar 20, 2024
@johnnv1
Member

johnnv1 commented Mar 20, 2024

Thanks for the report. I think it's indeed a bug; I ran into it here: #2800 (comment). cc @shijianjian

A workaround is to manually set keepdim after initializing the container, as in the snippet below. At the time, though, I couldn't find what was causing it, nor why the #2800 patch introduced it.

...
aug_dict.keepdim = True
...
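
For completeness, a sketch of the workaround applied to the reproduction above (assuming the container exposes a writable keepdim attribute, which is what the override relies on):

aug_dict = K.AugmentationSequential(
    K.RandomHorizontalFlip(p=1.0),
    data_keys=None,
    keepdim=True,
)
aug_dict.keepdim = True  # manual override after init

out = aug_dict({'image': img, 'mask': masks})
assert out["image"].shape == img.shape  # CHW preserved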

Basically, keepdim is not being initialized and/or propagated correctly when the container is constructed, so the container skips restoring the original (unbatched) dimensions before returning (keepdim defaults to False).
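
One quick way to confirm this would be to inspect the flag right after construction. A diagnostic sketch, assuming the container and its children each expose a keepdim attribute (children() is the standard nn.Module accessor):

aug_dict = K.AugmentationSequential(
    K.RandomHorizontalFlip(p=1.0),
    data_keys=None,
    keepdim=True,
)
print(aug_dict.keepdim)  # expected True; the bug suggests it ends up False
for child in aug_dict.children():
    print(type(child).__name__, getattr(child, "keepdim", None))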
