
Comments (4)

KumoLiu commented on August 17, 2024

Hi @idinsmore1, I guess the error may be caused by the ToNumpyd in the postprocessing_transform. Could you please remove it and try again?
Thanks.
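To illustrate the suggestion, here is a pure-Python sketch (not a MONAI API, and runnable without MONAI installed) of rebuilding a transform list with the ToNumpyd step dropped while keeping the rest in order; the stand-in classes are hypothetical placeholders:

```python
# Hypothetical helper: return a new transform list with every transform whose
# class name matches `cls_name` removed, preserving the original order.
def drop_transform_by_name(transforms, cls_name):
    return [t for t in transforms if type(t).__name__ != cls_name]


if __name__ == "__main__":
    # Stand-in classes so the sketch runs without MONAI installed.
    class Activationsd: ...
    class Invertd: ...
    class ToNumpyd: ...

    pipeline = [Activationsd(), Invertd(), ToNumpyd()]
    trimmed = drop_transform_by_name(pipeline, "ToNumpyd")
    print([type(t).__name__ for t in trimmed])  # ['Activationsd', 'Invertd']
```

With MONAI, the equivalent would be constructing the postprocessing Compose without the ToNumpyd entry so the prediction stays a MetaTensor end to end.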


idinsmore1 commented on August 17, 2024

Hi @KumoLiu, unfortunately this did not work. Here's the full traceback:

OutOfMemoryError Traceback (most recent call last)
File ~/mambaforge/envs/monai/lib/python3.11/site-packages/monai/transforms/transform.py:141, in apply_transform(transform, data, map_items, unpack_items, log_stats, lazy, overrides)
140 return [_apply_transform(transform, item, unpack_items, lazy, overrides, log_stats) for item in data]
--> 141 return _apply_transform(transform, data, unpack_items, lazy, overrides, log_stats)
142 except Exception as e:
143 # if in debug mode, don't swallow exception so that the breakpoint
144 # appears where the exception was raised.

File ~/mambaforge/envs/monai/lib/python3.11/site-packages/monai/transforms/transform.py:98, in _apply_transform(transform, data, unpack_parameters, lazy, overrides, logger_name)
96 return transform(*data, lazy=lazy) if isinstance(transform, LazyTrait) else transform(*data)
---> 98 return transform(data, lazy=lazy) if isinstance(transform, LazyTrait) else transform(data)

File ~/mambaforge/envs/monai/lib/python3.11/site-packages/monai/transforms/spatial/dictionary.py:527, in Spacingd.inverse(self, data)
526 for key in self.key_iterator(d):
--> 527 d[key] = self.spacing_transform.inverse(cast(torch.Tensor, d[key]))
528 return d

File ~/mambaforge/envs/monai/lib/python3.11/site-packages/monai/transforms/spatial/array.py:543, in Spacing.inverse(self, data)
542 def inverse(self, data: torch.Tensor) -> torch.Tensor:
--> 543 return self.sp_resample.inverse(data)

File ~/mambaforge/envs/monai/lib/python3.11/site-packages/monai/transforms/spatial/array.py:248, in SpatialResample.inverse(self, data)
246 with self.trace_transform(False):
247 # we can't use self.__call__ in case a child class calls this inverse.
--> 248 out: torch.Tensor = SpatialResample.__call__(self, data, **kw_args)
249 kw_args["src_affine"] = kw_args.get("dst_affine")

File ~/mambaforge/envs/monai/lib/python3.11/site-packages/monai/transforms/spatial/array.py:223, in SpatialResample.__call__(self, img, dst_affine, spatial_size, mode, padding_mode, align_corners, dtype, lazy)
222 lazy_ = self.lazy if lazy is None else lazy
--> 223 return spatial_resample(
224 img,
225 dst_affine,
226 spatial_size,
227 mode,
228 padding_mode,
229 align_corners,
230 dtype_pt,
231 lazy=lazy_,
232 transform_info=self.get_transform_info(),
233 )

File ~/mambaforge/envs/monai/lib/python3.11/site-packages/monai/transforms/spatial/functional.py:178, in spatial_resample(img, dst_affine, spatial_size, mode, padding_mode, align_corners, dtype_pt, lazy, transform_info)
175 affine_xform = AffineTransform( # type: ignore
176 normalized=False, mode=_m, padding_mode=_p, align_corners=align_corners, reverse_indexing=True
177 )
--> 178 img = affine_xform(img.unsqueeze(0), theta=xform.to(img), spatial_size=spatial_size).squeeze(0) # type: ignore
179 if additional_dims:

File ~/mambaforge/envs/monai/lib/python3.11/site-packages/torch/nn/modules/module.py:1518, in Module._wrapped_call_impl(self, *args, **kwargs)
1517 else:
-> 1518 return self._call_impl(*args, **kwargs)

File ~/mambaforge/envs/monai/lib/python3.11/site-packages/torch/nn/modules/module.py:1527, in Module._call_impl(self, *args, **kwargs)
1524 if not (self._backward_hooks or self._backward_pre_hooks or self._forward_hooks or self._forward_pre_hooks
1525 or _global_backward_pre_hooks or _global_backward_hooks
1526 or _global_forward_hooks or _global_forward_pre_hooks):
-> 1527 return forward_call(*args, **kwargs)
1529 try:

File ~/mambaforge/envs/monai/lib/python3.11/site-packages/monai/networks/layers/spatial_transforms.py:579, in AffineTransform.forward(self, src, theta, spatial_size)
575 raise ValueError(
576 f"affine and image batch dimension must match, got affine={theta.shape[0]} image={src_size[0]}."
577 )
--> 579 grid = nn.functional.affine_grid(theta=theta[:, :sr], size=list(dst_size), align_corners=self.align_corners)
580 dst = nn.functional.grid_sample(
581 input=src.contiguous(),
582 grid=grid,
(...)
585 align_corners=self.align_corners,
586 )

File ~/mambaforge/envs/monai/lib/python3.11/site-packages/torch/nn/functional.py:4399, in affine_grid(theta, size, align_corners)
4397 raise ValueError(f"Expected non-zero, positive output size. Got {size}")
-> 4399 return torch.affine_grid_generator(theta, size, align_corners)

OutOfMemoryError: CUDA out of memory. Tried to allocate 11.53 GiB. GPU 0 has a total capacty of 31.75 GiB of which 5.47 GiB is free. Including non-PyTorch memory, this process has 26.27 GiB memory in use. Of the allocated memory 24.98 GiB is allocated by PyTorch, and 158.96 MiB is reserved by PyTorch but unallocated. If reserved but unallocated memory is large try setting max_split_size_mb to avoid fragmentation. See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF

The above exception was the direct cause of the following exception:

RuntimeError Traceback (most recent call last)
File ~/mambaforge/envs/monai/lib/python3.11/site-packages/monai/transforms/transform.py:141, in apply_transform(transform, data, map_items, unpack_items, log_stats, lazy, overrides)
140 return [_apply_transform(transform, item, unpack_items, lazy, overrides, log_stats) for item in data]
--> 141 return _apply_transform(transform, data, unpack_items, lazy, overrides, log_stats)
142 except Exception as e:
143 # if in debug mode, don't swallow exception so that the breakpoint
144 # appears where the exception was raised.

File ~/mambaforge/envs/monai/lib/python3.11/site-packages/monai/transforms/transform.py:98, in _apply_transform(transform, data, unpack_parameters, lazy, overrides, logger_name)
96 return transform(*data, lazy=lazy) if isinstance(transform, LazyTrait) else transform(*data)
---> 98 return transform(data, lazy=lazy) if isinstance(transform, LazyTrait) else transform(data)

File ~/mambaforge/envs/monai/lib/python3.11/site-packages/monai/transforms/post/dictionary.py:706, in Invertd.__call__(self, data)
705 with allow_missing_keys_mode(self.transform): # type: ignore
--> 706 inverted = self.transform.inverse(input_dict)
708 # save the inverted data

File ~/mambaforge/envs/monai/lib/python3.11/site-packages/monai/transforms/compose.py:364, in Compose.inverse(self, data)
363 for t in reversed(invertible_transforms):
--> 364 data = apply_transform(
365 t.inverse, data, self.map_items, self.unpack_items, lazy=False, log_stats=self.log_stats
366 )
367 return data

File ~/mambaforge/envs/monai/lib/python3.11/site-packages/monai/transforms/transform.py:171, in apply_transform(transform, data, map_items, unpack_items, log_stats, lazy, overrides)
170 _log_stats(data=data)
--> 171 raise RuntimeError(f"applying transform {transform}") from e

RuntimeError: applying transform <bound method Spacingd.inverse of <monai.transforms.spatial.dictionary.Spacingd object at 0x7f2485cff450>>

The above exception was the direct cause of the following exception:

RuntimeError Traceback (most recent call last)
Cell In[6], line 15
14 print('postprocessing on gpu')
---> 15 out = [gpu_postprocessing(i) for i in decollate_batch(data)][0]
16 except RuntimeError as e:

Cell In[6], line 15, in <listcomp>(.0)
14 print('postprocessing on gpu')
---> 15 out = [gpu_postprocessing(i) for i in decollate_batch(data)][0]
16 except RuntimeError as e:

File ~/mambaforge/envs/monai/lib/python3.11/site-packages/monai/transforms/compose.py:335, in Compose.__call__(self, input_, start, end, threading, lazy)
334 _lazy = self.lazy if lazy is None else lazy
--> 335 result = execute_compose(
336 input_,
337 transforms=self.transforms,
338 start=start,
339 end=end,
340 map_items=self.map_items,
341 unpack_items=self.unpack_items,
342 lazy=_lazy,
343 overrides=self.overrides,
344 threading=threading,
345 log_stats=self.log_stats,
346 )
348 return result

File ~/mambaforge/envs/monai/lib/python3.11/site-packages/monai/transforms/compose.py:111, in execute_compose(data, transforms, map_items, unpack_items, start, end, lazy, overrides, threading, log_stats)
110 _transform = deepcopy(_transform) if isinstance(_transform, ThreadUnsafe) else _transform
--> 111 data = apply_transform(
112 _transform, data, map_items, unpack_items, lazy=lazy, overrides=overrides, log_stats=log_stats
113 )
114 data = apply_pending_transforms(data, None, overrides, logger_name=log_stats)

File ~/mambaforge/envs/monai/lib/python3.11/site-packages/monai/transforms/transform.py:171, in apply_transform(transform, data, map_items, unpack_items, log_stats, lazy, overrides)
170 _log_stats(data=data)
--> 171 raise RuntimeError(f"applying transform {transform}") from e

RuntimeError: applying transform <monai.transforms.post.dictionary.Invertd object at 0x7f2485d07ad0>

During handling of the above exception, another exception occurred:

RuntimeError Traceback (most recent call last)
File ~/mambaforge/envs/monai/lib/python3.11/site-packages/monai/transforms/transform.py:141, in apply_transform(transform, data, map_items, unpack_items, log_stats, lazy, overrides)
140 return [_apply_transform(transform, item, unpack_items, lazy, overrides, log_stats) for item in data]
--> 141 return _apply_transform(transform, data, unpack_items, lazy, overrides, log_stats)
142 except Exception as e:
143 # if in debug mode, don't swallow exception so that the breakpoint
144 # appears where the exception was raised.

File ~/mambaforge/envs/monai/lib/python3.11/site-packages/monai/transforms/transform.py:98, in _apply_transform(transform, data, unpack_parameters, lazy, overrides, logger_name)
96 return transform(*data, lazy=lazy) if isinstance(transform, LazyTrait) else transform(*data)
---> 98 return transform(data, lazy=lazy) if isinstance(transform, LazyTrait) else transform(data)

File ~/mambaforge/envs/monai/lib/python3.11/site-packages/monai/transforms/spatial/dictionary.py:527, in Spacingd.inverse(self, data)
526 for key in self.key_iterator(d):
--> 527 d[key] = self.spacing_transform.inverse(cast(torch.Tensor, d[key]))
528 return d

File ~/mambaforge/envs/monai/lib/python3.11/site-packages/monai/transforms/spatial/array.py:543, in Spacing.inverse(self, data)
542 def inverse(self, data: torch.Tensor) -> torch.Tensor:
--> 543 return self.sp_resample.inverse(data)

File ~/mambaforge/envs/monai/lib/python3.11/site-packages/monai/transforms/spatial/array.py:236, in SpatialResample.inverse(self, data)
235 def inverse(self, data: torch.Tensor) -> torch.Tensor:
--> 236 transform = self.pop_transform(data)
237 # Create inverse transform

File ~/mambaforge/envs/monai/lib/python3.11/site-packages/monai/transforms/inverse.py:328, in TraceableTransform.pop_transform(self, data, key, check)
314 """
315 Return and pop the most recent transform.
316
(...)
326 - RuntimeError: data is neither MetaTensor nor dictionary
327 """
--> 328 return self.get_most_recent_transform(data, key, check, pop=True)

File ~/mambaforge/envs/monai/lib/python3.11/site-packages/monai/transforms/inverse.py:299, in TraceableTransform.get_most_recent_transform(self, data, key, check, pop)
298 if not self.tracing:
--> 299 raise RuntimeError("Transform Tracing must be enabled to get the most recent transform.")
300 if isinstance(data, MetaTensor):

RuntimeError: Transform Tracing must be enabled to get the most recent transform.

The above exception was the direct cause of the following exception:

RuntimeError Traceback (most recent call last)
File ~/mambaforge/envs/monai/lib/python3.11/site-packages/monai/transforms/transform.py:141, in apply_transform(transform, data, map_items, unpack_items, log_stats, lazy, overrides)
140 return [_apply_transform(transform, item, unpack_items, lazy, overrides, log_stats) for item in data]
--> 141 return _apply_transform(transform, data, unpack_items, lazy, overrides, log_stats)
142 except Exception as e:
143 # if in debug mode, don't swallow exception so that the breakpoint
144 # appears where the exception was raised.

File ~/mambaforge/envs/monai/lib/python3.11/site-packages/monai/transforms/transform.py:98, in _apply_transform(transform, data, unpack_parameters, lazy, overrides, logger_name)
96 return transform(*data, lazy=lazy) if isinstance(transform, LazyTrait) else transform(*data)
---> 98 return transform(data, lazy=lazy) if isinstance(transform, LazyTrait) else transform(data)

File ~/mambaforge/envs/monai/lib/python3.11/site-packages/monai/transforms/post/dictionary.py:706, in Invertd.__call__(self, data)
705 with allow_missing_keys_mode(self.transform): # type: ignore
--> 706 inverted = self.transform.inverse(input_dict)
708 # save the inverted data

File ~/mambaforge/envs/monai/lib/python3.11/site-packages/monai/transforms/compose.py:364, in Compose.inverse(self, data)
363 for t in reversed(invertible_transforms):
--> 364 data = apply_transform(
365 t.inverse, data, self.map_items, self.unpack_items, lazy=False, log_stats=self.log_stats
366 )
367 return data

File ~/mambaforge/envs/monai/lib/python3.11/site-packages/monai/transforms/transform.py:171, in apply_transform(transform, data, map_items, unpack_items, log_stats, lazy, overrides)
170 _log_stats(data=data)
--> 171 raise RuntimeError(f"applying transform {transform}") from e

RuntimeError: applying transform <bound method Spacingd.inverse of <monai.transforms.spatial.dictionary.Spacingd object at 0x7f2485cff450>>

The above exception was the direct cause of the following exception:

RuntimeError Traceback (most recent call last)
Cell In[6], line 18
16 except RuntimeError as e:
17 print('switching to cpu')
---> 18 out = [cpu_postprocessing(i) for i in decollate_batch(data)][0]
20 # raise RuntimeError('test runtime error')
21 else:
22 print('postprocessing on cpu')

Cell In[6], line 18, in <listcomp>(.0)
16 except RuntimeError as e:
17 print('switching to cpu')
---> 18 out = [cpu_postprocessing(i) for i in decollate_batch(data)][0]
20 # raise RuntimeError('test runtime error')
21 else:
22 print('postprocessing on cpu')

File ~/mambaforge/envs/monai/lib/python3.11/site-packages/monai/transforms/compose.py:335, in Compose.__call__(self, input_, start, end, threading, lazy)
333 def __call__(self, input_, start=0, end=None, threading=False, lazy: bool | None = None):
334 _lazy = self.lazy if lazy is None else lazy
--> 335 result = execute_compose(
336 input_,
337 transforms=self.transforms,
338 start=start,
339 end=end,
340 map_items=self.map_items,
341 unpack_items=self.unpack_items,
342 lazy=_lazy,
343 overrides=self.overrides,
344 threading=threading,
345 log_stats=self.log_stats,
346 )
348 return result

File ~/mambaforge/envs/monai/lib/python3.11/site-packages/monai/transforms/compose.py:111, in execute_compose(data, transforms, map_items, unpack_items, start, end, lazy, overrides, threading, log_stats)
109 if threading:
110 _transform = deepcopy(_transform) if isinstance(_transform, ThreadUnsafe) else _transform
--> 111 data = apply_transform(
112 _transform, data, map_items, unpack_items, lazy=lazy, overrides=overrides, log_stats=log_stats
113 )
114 data = apply_pending_transforms(data, None, overrides, logger_name=log_stats)
115 return data

File ~/mambaforge/envs/monai/lib/python3.11/site-packages/monai/transforms/transform.py:171, in apply_transform(transform, data, map_items, unpack_items, log_stats, lazy, overrides)
169 else:
170 _log_stats(data=data)
--> 171 raise RuntimeError(f"applying transform {transform}") from e

RuntimeError: applying transform <monai.transforms.post.dictionary.Invertd object at 0x7f2485cff890>


idinsmore1 commented on August 17, 2024

I've been testing this, and I believe the error stems from the call to SpatialResample when performing the inverse of Spacingd after the error is caught, since that is the first transformation to be inverted. I manually set all the transforms' tracing attributes to True using:

preprocessing.tracing = True
for transform in preprocessing.transforms:
    transform.tracing = True
cpu_postprocessing.tracing = True
for transform in cpu_postprocessing.transforms:
    transform.tracing = True
gpu_postprocessing.tracing = True
for transform in gpu_postprocessing.transforms:
    transform.tracing = True

And the error still occurs. The output of both data['image'].applied_operations and data['pred'].applied_operations shows every transformation with tracing: True, and I verified that data['image'].applied_operations == data['pred'].applied_operations. The only transformation not listed there is SpatialResample, which may explain why this still fails even after manually setting the attribute.
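For reference, a pure-Python sketch of the kind of comparison described here, assuming applied_operations behaves like a list of dicts whose entries record the transform under a "class" key (the shape of MONAI's trace records); the data below is made up:

```python
# Compare two applied_operations stacks by length and recorded class order.
def ops_stacks_match(ops_a, ops_b):
    if len(ops_a) != len(ops_b):
        return False
    return all(a.get("class") == b.get("class") for a, b in zip(ops_a, ops_b))


image_ops = [{"class": "Orientationd", "tracing": True},
             {"class": "Spacingd", "tracing": True}]
pred_ops = [{"class": "Orientationd", "tracing": True},
            {"class": "Spacingd", "tracing": True}]
print(ops_stacks_match(image_ops, pred_ops))  # True
```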


idinsmore1 commented on August 17, 2024

OK, I actually got this working; I'm going to assume this is not MONAI's expected/desired behavior. When running this inference loop, preprocessing.transforms[-2] is the Spacingd transform. Before the first exception, preprocessing.transforms[-2].spacing_transform.sp_resample.tracing == True; this is the tracing attribute of the SpatialResample used inside Spacingd. After the error is caught, preprocessing.transforms[-2].spacing_transform.sp_resample.tracing == False, which breaks the Invertd transform. So if you manually reset the preprocessing tracing attributes like this:

def reset_tracing(preprocessing):
    preprocessing.tracing = True
    for transform in preprocessing.transforms:
        transform.tracing = True
    preprocessing.transforms[-2].spacing_transform.sp_resample.tracing = True
    return preprocessing

and call this function inside the exception handler, everything works as expected.
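A more general variant of that helper (a pure-Python sketch, not a MONAI API) walks an object's attributes recursively and re-enables any tracing flag it finds, so nested transforms like the sp_resample inside Spacingd are covered without hard-coding the index; the dummy classes below just stand in for MONAI transforms:

```python
# Hypothetical generalized reset: recursively flip every `tracing` flag back to
# True, descending into attributes and into a `transforms` sequence if present.
def reset_tracing_recursive(obj, seen=None, max_depth=5):
    if seen is None:
        seen = set()
    if max_depth <= 0 or id(obj) in seen:
        return obj
    seen.add(id(obj))
    if hasattr(obj, "tracing"):
        obj.tracing = True
    # `transforms` is typically a tuple (which has no __dict__), so iterate it
    # explicitly in addition to the plain attributes.
    children = list(getattr(obj, "__dict__", {}).values())
    children += list(getattr(obj, "transforms", ()))
    for child in children:
        if hasattr(child, "__dict__"):
            reset_tracing_recursive(child, seen, max_depth - 1)
    return obj


if __name__ == "__main__":
    # Dummy stand-ins mirroring the nesting Compose -> Spacingd -> sp_resample.
    class Inner:
        def __init__(self): self.tracing = False
    class Mid:
        def __init__(self): self.sp_resample = Inner(); self.tracing = False
    class Pipe:
        def __init__(self): self.transforms = (Mid(),); self.tracing = False

    p = reset_tracing_recursive(Pipe())
    print(p.tracing, p.transforms[0].tracing, p.transforms[0].sp_resample.tracing)
    # True True True
```

The `seen` set and depth cap guard against cycles; a real MONAI pipeline may hold other nested stateful objects, so this is a sketch of the idea rather than a drop-in fix.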

preprocessing = Compose([
    LoadImaged(keys=['image']),
    EnsureChannelFirstd(keys=['image']),
    ThresholdIntensityd(keys=['image'], threshold=task_config['percentile_95'], above=False, cval=task_config['percentile_95']),
    ThresholdIntensityd(keys=['image'], threshold=task_config['percentile_05'], above=True, cval=task_config['percentile_05']),
    NormalizeIntensityd(keys=['image'], subtrahend=task_config['mean'], divisor=task_config['std']),
    CropForegroundd(keys=["image"], source_key="image", allow_smaller=True, select_fn=lambda x: x > task_config['crop_threshold']),
    Orientationd(keys=['image'], axcodes='RAS'),
    Spacingd(keys=['image'], pixdim=task_config['spacing'], mode='bilinear'),
    EnsureTyped(keys=['image'], track_meta=True)
])
postprocessing_transform = Compose([
    Activationsd(keys=['pred'], softmax=True),
    AsDiscreted(keys=['pred'], argmax=True),
    Invertd(keys=['pred'], transform=preprocessing, orig_keys='image', meta_keys='image_meta_dict', nearest_interp=True, to_tensor=True),
    SqueezeDimd(keys=['pred'], dim=0),
    ToNumpyd(keys=['pred'], dtype=np.uint8)
])
gpu_postprocessing = Compose([EnsureTyped(keys=['pred'], device=device), postprocessing_transform])
cpu_postprocessing = Compose([EnsureTyped(keys=['pred'], device='cpu'), postprocessing_transform])

dataset = CacheDataset(data_dict, cache_rate=1.0, transform=preprocessing, num_workers=4)
dataloader = ThreadDataLoader(dataset, batch_size=1, num_workers=0, pin_memory=True)
# set up the adaptive inferer
inferer = SlidingWindowInfererAdapt(roi_size=model_config[task]['patch_size'], sw_batch_size=batch_size, overlap=0.5)
# Run the inference loop
with autocast():
    with torch.no_grad():
        for data in dataloader:
            images = data['image'].to(device)
            # Run inference
            start_time = time.time()
            pred = inferer(inputs=images, network=model)
            inference_time = round(time.time() - start_time, 2)
            data['pred'] = pred
            # Delete the images to save gpu memory
            del images 
            # Run postprocessing. Only have one item so take index 0
            processing_start = time.time()
            # Attempt to run postprocessing on GPU, if it fails due to OOM, run it on CPU
            # If the prediction is on CPU (== -1), we just go right to CPU postprocessing
            if data['pred'].get_device() != -1:
                try:
                    out = [gpu_postprocessing(i) for i in decollate_batch(data)][0]
                except RuntimeError as e: # this is almost always an OOM error
                    print('Switching to CPU for postprocessing')
                    preprocessing = reset_tracing(preprocessing)
                    out = [cpu_postprocessing(i) for i in decollate_batch(data)][0] # After the first attempt of this, every following postprocessing transformation fails
            else:
                out = [cpu_postprocessing(i) for i in decollate_batch(data)][0]
            write_prediction(out)
            del pred
            del out
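Stripped of the MONAI specifics, the try-on-GPU, fall-back-to-CPU pattern in that loop can be sketched as a small pure-Python wrapper (the function names and the hook are hypothetical; in the real loop the hook would be the reset_tracing call, optionally plus torch.cuda.empty_cache()):

```python
# Generic GPU-first execution with CPU fallback on RuntimeError (the class
# PyTorch raises for CUDA OOM). `before_fallback` is an optional hook run once
# before retrying, e.g. to reset transform tracing state or free CUDA cache.
def run_with_cpu_fallback(gpu_fn, cpu_fn, item, before_fallback=None):
    try:
        return gpu_fn(item)
    except RuntimeError:
        if before_fallback is not None:
            before_fallback()
        return cpu_fn(item)


if __name__ == "__main__":
    def flaky_gpu(x):
        raise RuntimeError("CUDA out of memory (simulated)")

    result = run_with_cpu_fallback(flaky_gpu, lambda x: x * 2, 21)
    print(result)  # 42
```

Keeping the fallback hook explicit makes the hidden state mutation (the tracing flag flipping to False) visible at the call site instead of being a surprise on the second batch.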

