Fastest way to utilize a custom loss (which supports 4D tensors) for 5D tensors?

Hello all, I have a custom loss that only works for 4D tensors such as NxBxHxW. However, I want to use the loss on my data, which is a 5D tensor such as NxBxDxHxW. I am wondering what the fastest way is to use the loss on my input, and how to do it. I hear that we can use the view() function. Thanks
This is an example of using the loss on a 4D tensor:

s = myloss()
# Random image-like batch in [0, 1]: [N, C, H, W] = [20, 3, 256, 256]
a = torch.randint(0, 255, size=(20, 3, 256, 256), dtype=torch.float32).cuda() / 255.
b = a * 0.5
a.requires_grad = True
b.requires_grad = True
loss = s(a, b)
loss.backward()

Hi John1231983,
Could you post the code that defines your loss function?

Sorry, no! It is my custom loss function.

Hi John1231983,
Apologies. My intention was to help you optimize the loss function and generalize it to 5D tensors.
In general, if you have a 5D tensor [N, C, H, W, D], you should make use of the information available in the fifth dimension.
You could of course reshape your tensor to [N*C, H, W, D] using .view(N * C, H, W, D). However, this may give rise to unwanted effects.
For example, when applying a weighted cross-entropy loss to medical MRI scans for brain tumor segmentation, per-patient weights and per-slice weights could give very different losses.
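As a minimal sketch of the .view approach above (using torch.nn.functional.mse_loss as a hypothetical stand-in for your custom 4D-only loss, since its definition isn't posted):

```python
import torch

# Hypothetical stand-in for the custom loss; assumed to accept any 4D input.
def myloss(pred, target):
    return torch.nn.functional.mse_loss(pred, target)

# Small 5D example: [N, C, H, W, D]
N, C, H, W, D = 2, 3, 8, 8, 4
a = torch.rand(N, C, H, W, D, requires_grad=True)
b = (a.detach() * 0.5).requires_grad_(True)

# Collapse the first two dims as suggested: [N, C, H, W, D] -> [N*C, H, W, D].
# .view works here because the tensors are contiguous; after a .permute
# you would need .reshape or .contiguous().view instead.
a4 = a.view(N * C, H, W, D)
b4 = b.view(N * C, H, W, D)

loss = myloss(a4, b4)
loss.backward()  # gradients flow back through the view to a and b
```

Note that .view is essentially free (no copy), so the overhead is negligible; the real question is whether collapsing those dimensions is semantically valid for your loss, per the weighting caveat above.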
Hope this helps!