Custom loss function based on external library

How can I write a model / loss function where the loss is calculated by an external tool?

I wrote some pseudocode to illustrate what I want to achieve:

import subprocess

import torch.optim as optim

model = CustomModel().to(device)  # let's say 1000 inputs and 10 outputs
optimizer = optim.Adam(model.parameters(), lr=0.01)

inputs = next(iter(trainloader))
model_out = model(inputs)  # 10 values that will be used to calculate the loss

save_outs_to_file('model_out.txt', model_out)

# Run the external tool and parse its stdout (bytes) into a float
proc = subprocess.Popen(["./calculate_loss", "-i", "model_out.txt"],
                        stdout=subprocess.PIPE)
loss = float(proc.stdout.read())

# What to do next with this loss?
# How to perform backprop?
# ... .backward()

optimizer.step()

Since you are leaving PyTorch, you would have to write a custom autograd.Function, as described here, and also implement the backward pass manually.
Autograd won’t be able to track the operations in the other process.
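A minimal sketch of what such a custom autograd.Function could look like; run_external_tool and manual_grad are hypothetical placeholders for the subprocess call and the hand-written gradient, not a real API:

import torch

class ExternalLoss(torch.autograd.Function):
    @staticmethod
    def forward(ctx, model_out):
        # Keep the input around for the backward pass
        ctx.save_for_backward(model_out)
        # run_external_tool is a placeholder for the subprocess call;
        # autograd records nothing about what happens inside it
        loss_value = run_external_tool(model_out)
        return torch.tensor(loss_value, device=model_out.device)

    @staticmethod
    def backward(ctx, grad_output):
        model_out, = ctx.saved_tensors
        # manual_grad is a placeholder: you have to supply
        # d(loss)/d(model_out) yourself
        return grad_output * manual_grad(model_out)

You would then call it as loss = ExternalLoss.apply(model_out), and loss.backward() would invoke your backward method.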

For what kind of applications would you need an external tool to calculate the loss, @adm? I have never encountered this kind of problem, just curious to know :slight_smile:

Thank you for your help. As far as I understand, I will not be able to solve this in this form, since I would have to provide forward and backward functions (which means reimplementing all operations the external tool performs so that PyTorch can compute the gradient). The problem is that I do not know exactly what the external tool does.

Then I doubt it’ll be possible to use it inside your forward pass. :confused:
Would it be possible to use an alternative, “open” approach with a different library, so that you could at least see which operations are performed?

Actually, what came to my mind is that I could calculate the gradient numerically. I know the range of possible values and they are all integers, so the epsilon will be 1. The question is how to integrate this into the PyTorch pipeline. Let’s say there is a model of X layers that outputs 10 parameters. Now I want to apply this custom numerical loss to these 10 parameters.

You could define a custom autograd.Function and calculate the gradient in the backward method manually as described here.
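A hedged sketch of how this could look when combined with the finite-difference idea from above, assuming the external tool can simply be re-run on perturbed outputs. The external_loss helper and file name are illustrative (save_outs_to_file is the helper from the original pseudocode), and the step size of 1 follows the integer-valued outputs mentioned earlier:

import subprocess

import torch

def external_loss(values):
    # Illustrative helper: write the outputs, run the tool, parse a float
    save_outs_to_file('model_out.txt', values)
    proc = subprocess.Popen(["./calculate_loss", "-i", "model_out.txt"],
                            stdout=subprocess.PIPE)
    return float(proc.stdout.read())

class NumericalExternalLoss(torch.autograd.Function):
    @staticmethod
    def forward(ctx, model_out):
        ctx.save_for_backward(model_out)
        return torch.tensor(external_loss(model_out), device=model_out.device)

    @staticmethod
    def backward(ctx, grad_output):
        model_out, = ctx.saved_tensors
        x = model_out.detach().clone().view(-1)
        eps = 1.0  # step size of 1, since the outputs are integers
        base = external_loss(model_out)
        grad = torch.zeros_like(x)
        # Forward differences: one extra call to the tool per output value
        for i in range(x.numel()):
            perturbed = x.clone()
            perturbed[i] += eps
            grad[i] = (external_loss(perturbed.view_as(model_out)) - base) / eps
        return grad_output * grad.view_as(model_out)

Note that each backward pass then costs N + 1 invocations of the external tool for N output values, which is only practical if the tool is cheap to run.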

Have you solved this problem? I have also been facing it recently. If you have any ideas, I would be happy if you shared them with me. Thank you! :grin: