Hi, the dataloaders you printed are fine; those are just references to the DataLoader objects. Please check the length of each dataloader instead. An easy way to do that:
print(len(dataloaders['train']))  # number of batches

# OR count the batches by iterating
num_batches = 0
for i, data in enumerate(dataloaders['train']):
    num_batches = i + 1
print("Length of dataloader:", num_batches)
The training code itself looks fine, but your loss variable is assigned only inside the loop, and that loop runs once per batch in the dataloader. For one of your dataloaders the loop never runs (it yields zero batches), so loss is never assigned and you get the error.
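Here is a minimal, self-contained sketch of why that raises the error; train_step and the dummy loaders are made-up names for illustration, not your code:

def train_step(loader):
    for batch in loader:
        loss = batch * 2              # loss is assigned only inside the loop
    return loss                       # if loader yielded nothing, loss was never created

train_step([1, 2, 3])                 # fine: the loop ran, so loss exists
train_step([])                        # raises UnboundLocalError: the loop body never ran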
You can initialize the loss variable outside the for loop, but that won't solve the underlying issue (the empty dataloader); it will only stop the error from being raised.
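The more useful fix is to find out why that dataloader is empty (wrong dataset path, an over-aggressive filter, a bad train/val split) and fail early with a clear message. A small sketch, assuming the usual dataloaders dict with 'train' and 'val' keys:

for phase in ['train', 'val']:
    if len(dataloaders[phase]) == 0:
        raise ValueError(f"Dataloader for phase '{phase}' is empty; "
                         "check the dataset path and the train/val split.")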
Hope it helps.