[IMP] Extend NiftiDataLoader Operator to output Image object #421

Open
jtetrea opened this issue May 1, 2023 · 3 comments

jtetrea commented May 1, 2023

Is your enhancement request related to a problem? Please describe.

The NiftiDataLoader operator currently outputs an np.ndarray. It needs to be extended to wrap the array in an Image object with the correct meta/header information, so that it integrates more easily with existing MAP workflows.

Describe the solution you'd like

A NiftiDataLoader that emits an Image object as its IN_MEMORY output (see the sketch below).
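
For illustration, a minimal sketch of what such a loader could return, assuming the SDK's `Image` class (`monai.deploy.core.Image`) accepts a pixel array plus a metadata dict; the constructor signature, helper name, and metadata keys shown here are assumptions, not the SDK's confirmed API:

```python
# Hypothetical sketch: load a NIfTI file with SimpleITK and wrap it in the
# SDK's Image object; the Image(data, metadata=...) signature and the
# metadata keys are assumptions for illustration.
import SimpleITK as sitk
from monai.deploy.core import Image


def load_nifti_as_image(path: str) -> Image:
    """Load a NIfTI volume and attach basic spatial metadata."""
    itk_image = sitk.ReadImage(path)
    data = sitk.GetArrayFromImage(itk_image)  # array in (z, y, x) order
    metadata = {
        "spacing": itk_image.GetSpacing(),
        "origin": itk_image.GetOrigin(),
        "direction": itk_image.GetDirection(),
        "source_file": path,
    }
    return Image(data, metadata=metadata)
```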

Describe alternatives you've considered

An alternative would be to use the original DICOM images and take advantage of existing operators, but that is not possible in cases where NIfTI files are the only available form of the data.

Additional context

An enhancement like this could enable workflows such as processing a large batch (10,000+) of NIfTI images for rapid inference.

@jtetrea jtetrea added the enhancement New feature or request label May 1, 2023

jtetrea commented May 1, 2023

@vikashg


vikashg commented May 2, 2023

@jtetrea I used SimpleITK as the image loader for the NIfTI image. However, if you use LoadImage from MONAI, it outputs the metadata as well as the image array. I think we can change the loader to LoadImage and pass on the meta information. Will that solve the problem?
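
A rough sketch of that suggestion, using MONAI's `LoadImage` transform with `image_only=False` to get the image together with a metadata dict (the key names such as `"affine"` may vary by MONAI version, and the file path is hypothetical):

```python
# Sketch: use MONAI's LoadImage so the NIfTI metadata comes back with the array.
from monai.transforms import LoadImage

loader = LoadImage(image_only=False)            # return (image, meta) rather than image only
image, meta = loader("example_volume.nii.gz")   # hypothetical file path
print(image.shape)
print(meta.get("affine"))                       # spatial metadata to pass downstream
```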


MMelQin commented May 3, 2023

The MONAI Core LoadImage function with its NIfTI reader also outputs a MetaTensor, which is essentially a Torch tensor carrying the few pieces of metadata returned by the NIfTI reader. The Deploy App SDK inference operators' internals and the InMemImageReader were updated to support MetaTensor in v0.5.
So we will consider supporting MetaTensor as an inference operator input type, allowing the MetaTensor from the NIfTI data loader to be fed directly to the inference operator.
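
For reference, a minimal sketch of the MetaTensor behavior described above, assuming a recent MONAI version where `LoadImage(image_only=True)` returns a `MetaTensor` (the file path is hypothetical):

```python
# Sketch: LoadImage returns a MetaTensor, a torch.Tensor subclass that carries
# the NIfTI metadata alongside the pixel data, so it could be passed directly
# to an inference operator that accepts MetaTensor input.
from monai.data import MetaTensor
from monai.transforms import LoadImage

img = LoadImage(image_only=True)("example_volume.nii.gz")  # hypothetical path
assert isinstance(img, MetaTensor)
print(img.shape)                        # tensor data
print(img.affine)                       # affine matrix carried as metadata
print(img.meta.get("filename_or_obj"))  # other metadata from the NIfTI reader
```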

@MMelQin MMelQin assigned MMelQin and jtetrea and unassigned MMelQin Jun 23, 2023