How to do scalar operations on the gradients of a model

Hello, I want to do some scalar operations on the gradients of a model, such as adding the gradients from two points, or multiplying a gradient by a float. How can I do this?

I am not optimizing the model, so I am not using an optimizer.

What I want to do is something like the following

```python
model(x1).backward()                               # assuming model(x1) returns a scalar
g1 = [p.grad.clone() for p in model.parameters()]  # clone, because .grad buffers accumulate
model.zero_grad()
model(x2).backward()
g2 = [p.grad.clone() for p in model.parameters()]
model.zero_grad()
...
result = g1 * 4.542 + g2 * 5.3848 + ...            # this is the line I want to keep simple
```

Is there an easy way to pack the gradients of a model as a whole, so that I can do scalar operations on them? Or do I have to wrap them manually myself?
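
One idea I came across is torch.nn.utils.parameters_to_vector, which (as far as I understand) just flattens and concatenates any iterable of tensors, so it should also work on the grads. A rough sketch, assuming every parameter already has a .grad after backward():

```python
from torch.nn.utils import parameters_to_vector

# pack every parameter's gradient into one flat 1-D tensor
g1 = parameters_to_vector(p.grad for p in model.parameters())
```

With g1 and g2 packed like that, result = g1 * 4.542 + g2 * 5.3848 would just be ordinary element-wise tensor arithmetic.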

I want to keep the line

result = ...

simple, because this code is also used for other things besides this PyTorch model.
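
In case there is nothing built in, the manual wrapping I would otherwise write is roughly this (grads_as_vector is just a hypothetical name, not an existing API):

```python
import torch

def grads_as_vector(model):
    # hypothetical helper: detach, flatten and concatenate all parameter gradients
    return torch.cat([p.grad.detach().reshape(-1) for p in model.parameters()])
```

As long as g1 and g2 are plain 1-D tensors produced this way (or by whatever the non-PyTorch code already returns), the result = ... line can stay exactly as written.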