Convert a batch of indices to masks

I have a batch of box indices of shape [k, 4], where each row is [y0, x0, y1, x1], and I would like to create a mask of shape [k, h, w].

index = [[4, 4, 8, 8], [2, 2, 6, 6]]

mask = [[[0, 0, 0, 0, 0, 0, 0, 0],
         [0, 0, 0, 0, 0, 0, 0, 0],
         [0, 0, 0, 0, 0, 0, 0, 0],
         [0, 0, 0, 0, 0, 0, 0, 0],
         [0, 0, 0, 0, 1, 1, 1, 1],
         [0, 0, 0, 0, 1, 1, 1, 1],
         [0, 0, 0, 0, 1, 1, 1, 1],
         [0, 0, 0, 0, 1, 1, 1, 1]],

        [[0, 0, 0, 0, 0, 0, 0, 0],
         [0, 0, 0, 0, 0, 0, 0, 0],
         [0, 0, 1, 1, 1, 1, 0, 0],
         [0, 0, 1, 1, 1, 1, 0, 0],
         [0, 0, 1, 1, 1, 1, 0, 0],
         [0, 0, 1, 1, 1, 1, 0, 0],
         [0, 0, 0, 0, 0, 0, 0, 0],
         [0, 0, 0, 0, 0, 0, 0, 0]]]

A for-loop solution could be:

index = torch.tensor(index)
mask = torch.zeros(index.shape[0], h, w)
for m, i in zip(mask, index):
    m[i[0]:i[2], i[1]:i[3]] = 1

Try this:

import torch

k, h, w = 2, 8, 8
index = torch.tensor([[4, 4, 8, 8], [2, 2, 6, 6]]).long()
# coordinate grids of shape [h, w], tiled to [k, h, w]
hh, ww = torch.meshgrid(torch.arange(h), torch.arange(w), indexing='ij')
hh, ww = hh.repeat(k, 1, 1), ww.repeat(k, 1, 1)
# a cell belongs to a box when both its row and its column fall in range
hh = torch.logical_and(index[:, 0, None, None] <= hh, hh < index[:, 2, None, None])
ww = torch.logical_and(index[:, 1, None, None] <= ww, ww < index[:, 3, None, None])
mask = torch.logical_and(hh, ww)
print(mask)

PS: I am not sure it is faster than the for loop; test it yourself.
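A variant of the same idea that skips `meshgrid` and `repeat` entirely and relies on broadcasting alone: a row-range test of shape [k, h, 1] combined with a column-range test of shape [k, 1, w] broadcasts to the full [k, h, w] mask. A sketch, assuming the same box layout [y0, x0, y1, x1] as in the question:

```python
import torch

index = torch.tensor([[4, 4, 8, 8], [2, 2, 6, 6]])
k, h, w = index.shape[0], 8, 8

rows = torch.arange(h)  # shape [h]
cols = torch.arange(w)  # shape [w]

# row test -> [k, h, 1]; column test -> [k, 1, w]
row_in = (index[:, 0, None, None] <= rows[None, :, None]) & \
         (rows[None, :, None] < index[:, 2, None, None])
col_in = (index[:, 1, None, None] <= cols[None, None, :]) & \
         (cols[None, None, :] < index[:, 3, None, None])

# broadcasting the two tests yields the [k, h, w] mask
mask = (row_in & col_in).long()
print(mask)
```

Avoiding the `repeat` keeps the intermediate tensors small ([k, h, 1] and [k, 1, w] instead of two [k, h, w] grids), which may matter for large h and w.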