It means to make a list of tuples in the format (input filename, ground truth filename).
I use “MRI3DSegmentationDataset” for this.
Thanks for your help.
Creating tuples from filenames shouldn’t depend on the dimension property.
Would using this codebase as a starter work for your use case?
Yes, I used this codebase and changed my code to the following, but it raises an error.
When I load my data and build the input pairs, I use “MRI3DSubVolumeSegmentationDataset” and “MRI3DSegmentationDataset”.
First, I used “MRI3DSubVolumeSegmentationDataset”, but it fails with the error: “Input shape of each dimension should be a multiple of length plus 2 * padding”.
I don’t know what I’m doing wrong.
Is MRI3DSubVolumeSegmentationDataset meant to create patches?
import os
from torchvision import transforms
from medicaltorch import datasets as mt_datasets
from medicaltorch import transforms as mt_transforms

ROOT_DIR = "/home/elahe/data/dataset/"
# sort both listings so images and labels are paired consistently
img_list = sorted(os.listdir(os.path.join(ROOT_DIR, 'trainnii')))
label_list = sorted(os.listdir(os.path.join(ROOT_DIR, 'labelsnii')))
filename_pairs = [(os.path.join(ROOT_DIR, 'trainnii', x),
                   os.path.join(ROOT_DIR, 'labelsnii', y))
                  for x, y in zip(img_list, label_list)]
print(filename_pairs)

train_transform = transforms.Compose([
    mt_transforms.Resample(0.25, 0.25, 0.25),
    mt_transforms.ToTensor(),
])

# first attempt: subvolume dataset
train_dataset = mt_datasets.MRI3DSubVolumeSegmentationDataset(
    filename_pairs, cache=True, transform=train_transform,
    canonical=False, length=(64, 64, 64), padding=0)
# second attempt: plain 3D dataset
train_dataset = mt_datasets.MRI3DSegmentationDataset(
    filename_pairs, cache=True, transform=train_transform, canonical=False)
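The error message suggests that, after resampling, each volume dimension must be divisible by `length + 2 * padding`. A quick sanity check along those lines (a hypothetical helper, not part of medicaltorch) could look like this:

```python
def check_subvolume_shape(shape, length=(64, 64, 64), padding=0):
    """Return True if every dimension is a multiple of length + 2 * padding."""
    return all(d % (l + 2 * padding) == 0 for d, l in zip(shape, length))

print(check_subvolume_shape((128, 128, 64)))  # True: all dims divisible by 64
print(check_subvolume_shape((100, 128, 64)))  # False: 100 % 64 != 0
```

Running this on a volume’s resampled shape would show whether it satisfies the constraint before constructing the dataset.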
Hello banikr,
Can I access the “get_paired_patch_3D” function in your code?
Thanks.
Hi ptrblck, what should I do if I want to extract overlapping patches? For example, the image is 256×256×32 and the patch is 32×32×32, with a step size of 4. Do you happen to know of any example or tutorial?
unfold should work. Have a look at this post for an example.
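As a sketch of that suggestion, `Tensor.unfold` can be chained once per spatial dimension; each call adds a window dimension at the end. Using the sizes from the question (256×256×32 volume, 32×32×32 patches, stride 4):

```python
import torch

# toy volume standing in for the 256 x 256 x 32 image from the question
vol = torch.randn(256, 256, 32)

# one unfold per dimension: (size, step) = (32, 4)
patches = vol.unfold(0, 32, 4).unfold(1, 32, 4).unfold(2, 32, 4)
# (256 - 32) // 4 + 1 = 57 windows along the first two dims,
# (32 - 32) // 4 + 1 = 1 window along the last
print(patches.shape)  # torch.Size([57, 57, 1, 32, 32, 32])
```

Note that `unfold` returns views, so this is cheap; calling `.reshape(-1, 32, 32, 32)` afterwards would materialize a copy of all overlapping patches.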
Hey @Aliktk
There are different ways, mostly overlapping and non-overlapping methods.
import numpy as np

def generate_patch_32_3(MR, Mask, cor, sag, axi):
    """
    :param MR: 3D MR volume
    :param Mask: 3D mask with the same shape as the MR volume
    :param cor: patch size along the coronal axis
    :param sag: patch size along the sagittal axis
    :param axi: patch size along the axial axis
    :return: MR patches of shape [cor, sag, axi] (e.g. [32, 32, 32]) and
             corresponding mask patches of shape [cor//2, sag//2, axi//2]
             (e.g. [16, 16, 16]), plus the patch count
    """
    # half-margins used to center-crop each mask patch to half the MR patch size
    hCor = cor // 4
    hSag = sag // 4
    hAxi = axi // 4
    qShape = [96, 128, 128]  # size of each (possibly overlapping) quadrant
    c = [0, MR.shape[0] - qShape[0]]
    s = [0, MR.shape[1] - qShape[1]]
    a = [0, MR.shape[2] - qShape[2]]
    nQuad = len(c) * len(s) * len(a)
    nPatch = int(nQuad * (qShape[0] / cor) * (qShape[1] / sag) * (qShape[2] / axi))
    MR_patch = np.zeros([nPatch, cor, sag, axi], dtype=np.float32)
    Mask_patch = np.zeros([nPatch, cor // 2, sag // 2, axi // 2], dtype=np.int32)
    patch_count = 0
    for x in c:
        for y in s:
            for z in a:
                MR_quad = MR[x:x + qShape[0], y:y + qShape[1], z:z + qShape[2]]
                Mask_quad = Mask[x:x + qShape[0], y:y + qShape[1], z:z + qShape[2]]
                # tile each quadrant into non-overlapping patches
                for k in range(0, MR_quad.shape[0], cor):
                    for i in range(0, MR_quad.shape[1], sag):
                        for j in range(0, MR_quad.shape[2], axi):
                            MR_patch[patch_count] = MR_quad[k:k + cor, i:i + sag, j:j + axi]
                            # the mask patch is the center crop of the MR patch
                            Mask_patch[patch_count] = Mask_quad[k + hCor:k + cor - hCor,
                                                                i + hSag:i + sag - hSag,
                                                                j + hAxi:j + axi - hAxi]
                            patch_count += 1
    return MR_patch, Mask_patch, nPatch
Try the function here. You can leave out the Mask variable if you don’t have one. The function works with an MR volume loaded from NIfTI (.nii) data; you can use the nibabel Python library for that.
I have 131 CT volumes, and each volume has a different number of slices. How can I build a DataLoader for volumes with different numbers of slices?
You could use a custom collate_fn
to return e.g. a list
containing tensors of variable size, but the main question is how you are planning to use these different numbers of slices in the actual model training.
Assuming you are creating patches and thus adding a new dimension to the input tensor, how would the model use these inputs for training? One possible approach would be to use a batch size of 1, but that’s often not desired.
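A minimal sketch of the list-returning collate_fn idea, using a toy dataset (the slice counts and the 64×64 in-plane size are made up for illustration):

```python
import torch
from torch.utils.data import DataLoader, Dataset

class VolumeDataset(Dataset):
    """Toy dataset: volumes with a variable number of slices."""
    def __init__(self):
        self.n_slices = [10, 14, 7, 21]  # stand-in for per-volume slice counts

    def __len__(self):
        return len(self.n_slices)

    def __getitem__(self, idx):
        # each sample: [num_slices, H, W]
        return torch.randn(self.n_slices[idx], 64, 64)

def list_collate(batch):
    # keep variable-sized volumes in a plain list instead of stacking
    return list(batch)

loader = DataLoader(VolumeDataset(), batch_size=2, collate_fn=list_collate)
batch = next(iter(loader))
print([v.shape for v in batch])  # [torch.Size([10, 64, 64]), torch.Size([14, 64, 64])]
```

The default collate would fail here because the tensors cannot be stacked; the list keeps them intact, at the cost of the model having to iterate over the list (or the training loop falling back to an effective batch size of 1).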