Passing one model's output to another model to generate a sketch

Hi Team,

I am taking a Generator's output (its input contains both the edge and the original image) and feeding it to another network to produce a sketch of the image, so I can compute a loss between that output and the sketch, but unfortunately I am not able to generate the required sketch.

# Generator model
recon = model(real, edge, Target_color.float(), noise)
# Sketch model
fake_image = cycle_model(recon)

I then calculate the MSE loss between the two:
loss = criterion(fake_image, edge)

Here edge is a batch of 8 from the dataloader, and fake_image is also a batch of 8 from the model.
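Since edge comes out of the dataloader, it is worth checking that it is a single tensor (not a Python list of tensors) before handing it to the loss. A minimal sketch, with illustrative names and sizes (batch of 8 single-channel 64x64 images are assumptions, not your actual shapes):

```python
import torch
import torch.nn as nn

criterion = nn.MSELoss()

# Illustrative shapes only: a batch of 8 single-channel 64x64 images.
fake_image = torch.rand(8, 1, 64, 64)
edge = [torch.rand(1, 64, 64) for _ in range(8)]  # a list of tensors

# If edge is still a Python list, stack it into one (8, 1, 64, 64) tensor;
# nn.MSELoss expects two tensors of identical shape.
edge = torch.stack(edge)
assert fake_image.shape == edge.shape
loss = criterion(fake_image, edge)
```

If the shapes already match in your code, this check costs nothing and rules out a silent broadcasting bug.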

Could you please tell me whether the way I am passing input to the sketch model is correct?
If I display each image after this line:

fake_image = cycle_model(recon)
fake_image_ex = fake_image.detach().cpu()
for img in fake_image_ex:
    plt.imshow(np.transpose(img, (1, 2, 0)))
    plt.show()
the images are learned very well after some epochs and are displayed as sketches.

But when I use the whole model for inference to generate sketches, the images come out black.

for batch_idx, img in enumerate(zip(loaders['test_rgbimage_dataloader'],
                                    loaders['test_grayscale_dataloader'])):
    with torch.no_grad():
        real, edge = img[0], img[1]
        real = real[0].to(device)
        edge = edge[0].to(device)
        recon = trained_model(real, edge, Target_color.float(), noise)
        images = recon.detach().cpu()
        for im in images:
            imshow(im)

What could be the problem here?
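Two common causes of black images at inference time, worth ruling out: the model was left in training mode (BatchNorm/Dropout then behave differently than during the training epochs where the display worked), and the generator's output range was not rescaled before display (plt.imshow clips float values outside [0, 1], so a tanh output in [-1, 1] renders mostly dark). A minimal, self-contained sketch of both points, using a toy stand-in network (your real generator is of course different, and the tanh head is an assumption):

```python
import torch
import torch.nn as nn

# Toy stand-in for a generator with BatchNorm and a tanh head.
net = nn.Sequential(
    nn.Conv2d(3, 3, kernel_size=3, padding=1),
    nn.BatchNorm2d(3),
    nn.Tanh(),
)

net.eval()  # switch BatchNorm/Dropout to inference behaviour
with torch.no_grad():
    recon = net(torch.randn(8, 3, 64, 64))

# tanh output lies in [-1, 1]; rescale to [0, 1] before plt.imshow,
# otherwise negative values are clipped and the images look black.
images = (recon.cpu() + 1) / 2
assert 0.0 <= images.min() and images.max() <= 1.0
```

In your loop that would mean calling trained_model.eval() (and cycle_model.eval() if you also run the sketch model) before the with torch.no_grad(): block, and applying the same denormalization to recon that you implicitly relied on during training before displaying it.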