Hey Guys,

I used a timm pre-trained EfficientNet and fine-tuned it on my dataset. My test set has around 281 samples, including both class 0 and class 1.

Here is my model params:

```
<bound method Module.parameters of EfficientNet(
(conv_stem): Conv2d(3, 32, kernel_size=(3, 3), stride=(2, 2), padding=(1, 1), bias=False)
(bn1): BatchNorm2d(32, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
(act1): SiLU(inplace=True)
(blocks): Sequential(
(0): Sequential(
(0): EdgeResidual(
(conv_exp): Conv2d(32, 32, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
(bn1): BatchNorm2d(32, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
(act1): SiLU(inplace=True)
(se): Identity()
(conv_pwl): Conv2d(32, 32, kernel_size=(1, 1), stride=(1, 1), bias=False)
(bn2): BatchNorm2d(32, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
)
................... You can see the full model architecture here, as I am reaching the character limit for the post:
https://colab.research.google.com/drive/1rWaRA-jErgxecbD3jiXTeiHpzLafsiwE#scrollTo=xW-RaHSEnYMT
)>
```

I want to take the samples with the best predictions and apply LIME to them. So I need to produce an individual prediction for each file, then pick the files with the highest prediction confidence and run LIME on those.

But when I try this I get a dimensionality error. Is there a workaround?

Here is the code I am getting the error on:

```
import PIL
from torchvision import transforms

model = efficientnetv2_model
transform = transforms.Compose([transforms.ToTensor()])

for sample_fname, _ in testloader.dataset.samples:
    print(sample_fname)
    image = PIL.Image.open(sample_fname)
    tensor = transform(image)
    print(tensor.shape)
    logits = model(tensor)  # <- error raised here
```

This is the error (my images are black-and-white):

```
—> 12 logits = model(tensor)

RuntimeError: Given groups=1, weight of size [32, 3, 3, 3], expected input[1, 1, 128, 170] to have 3 channels, but got 1 channels instead
```
