Padding tensors efficiently?

Is there an efficient way to pad a list of 3D tensors? The function below takes ~2.5 seconds to pad 50 tensors of shape (L x 36 x 2048), where L varies between 3 and 7.

    import time

    import torch

    def padFeats(tensors_to_pad, batch_size, pad_val):
        max_len = max(len(t) for t in tensors_to_pad)
        # Pre-fill the output with the pad value.
        padded_tensor = torch.full((batch_size, max_len, 36, 2048), float(pad_val))
        pad_start = time.perf_counter()
        for i, val in enumerate(tensors_to_pad):
            # as_tensor avoids a copy when val is already a tensor
            padded_tensor[i, :len(val), ...] = torch.as_tensor(val)
        print("{} tensors were padded in {:.3f}s".format(
            len(tensors_to_pad), time.perf_counter() - pad_start))
        return padded_tensor
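For comparison, a minimal sketch using PyTorch's built-in `torch.nn.utils.rnn.pad_sequence`, which pads a list of variable-length tensors along the first dimension in a single call (assuming the inputs are already tensors with matching trailing dimensions; the shapes below just mirror the question):

```python
import torch
from torch.nn.utils.rnn import pad_sequence

# Example inputs mirroring the question: tensors of shape (L, 36, 2048),
# with L varying per tensor.
tensors = [torch.randn(l, 36, 2048) for l in [3, 5, 7]]

# pad_sequence pads every tensor along dim 0 up to the longest length,
# avoiding the Python-level per-element copy loop.
padded = pad_sequence(tensors, batch_first=True, padding_value=0.0)
print(padded.shape)  # torch.Size([3, 7, 36, 2048])
```

Note that `pad_sequence` sizes the batch dimension from the number of input tensors, so it does not reserve extra slots for a fixed `batch_size` the way the function above does.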