meyro123
(meyro123)
October 26, 2020, 3:52pm
#1
Hi everyone,

I’m currently trying to train a physics-informed NN and I have a (probably very) simple question regarding autograd.
My training samples are points in space and time, i.e. each sample is a two-dimensional tensor, and my question is as follows. When I compute the gradient w.r.t. the training data, i.e.

```
gradient = torch.autograd.grad(u,
                               trainingData,
                               create_graph=True,
                               allow_unused=True)
```

and I want to obtain the time derivative of u (with respect to the first component of the training data), does the time derivative coincide with this quantity:

`dtU = gradient[0][0]`?

And similarly, if I want to obtain the spatial derivative, would it be this quantity:

`dxU = gradient[0][1]`?
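For concreteness, a stripped-down sketch of my setup looks roughly like this (with a small placeholder network standing in for my actual PINN):

```python
import torch
import torch.nn as nn

# Placeholder for my PINN: maps a (t, x) pair to a scalar u.
net = nn.Sequential(nn.Linear(2, 16), nn.Tanh(), nn.Linear(16, 1))

# N samples, each a (t, x) pair, so trainingData has shape (N, 2).
trainingData = torch.rand(5, 2, requires_grad=True)

# Sum the network outputs to get a scalar to differentiate.
u = net(trainingData).sum()

gradient = torch.autograd.grad(u,
                               trainingData,
                               create_graph=True,
                               allow_unused=True)

# gradient is a tuple; gradient[0] has the same shape as trainingData.
print(gradient[0].shape)
```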

Thank you very much for your help!

Best
Fabian

KFrank
(K. Frank)
October 26, 2020, 5:42pm
#2
Hi Fabian!

I don’t understand what you are saying here.

The following example script may answer part of your question:

```
import torch
torch.__version__
def f (x, y):
    return x * y
x = torch.Tensor ([2.0])
x.requires_grad = True
y = torch.Tensor ([3.0])
y.requires_grad = True
u = f (x, y)
gradient = torch.autograd.grad (u, (x, y), create_graph=True, allow_unused=True)
gradient[0]
gradient[1]
```

Here is its output:

```
>>> import torch
>>> torch.__version__
'1.6.0'
>>>
>>> def f (x, y):
...     return x * y
...
>>> x = torch.Tensor ([2.0])
>>> x.requires_grad = True
>>> y = torch.Tensor ([3.0])
>>> y.requires_grad = True
>>> u = f (x, y)
>>> gradient = torch.autograd.grad (u, (x, y), create_graph=True, allow_unused=True)
>>> gradient[0]
tensor([3.], grad_fn=<MulBackward0>)
>>> gradient[1]
tensor([2.], grad_fn=<MulBackward0>)
```
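If your trainingData is instead a single tensor of shape (N, 2) holding (t, x) pairs (which is my guess at your setup), then autograd.grad returns one tensor of that same shape, and the time and space derivatives are its columns, not its first two rows. A small sketch with a stand-in for your network output:

```python
import torch

# Hypothetical setup: two (t, x) samples in one (N, 2) tensor.
trainingData = torch.tensor([[1.0, 2.0],
                             [3.0, 4.0]], requires_grad=True)

t = trainingData[:, 0]   # time components
x = trainingData[:, 1]   # space components
u = (t * x).sum()        # stand-in for your network output

gradient = torch.autograd.grad(u, trainingData,
                               create_graph=True,
                               allow_unused=True)

# gradient[0] has shape (N, 2): column 0 holds du/dt and
# column 1 holds du/dx, one row per sample.
dtU = gradient[0][:, 0]  # here: x = [2., 4.]
dxU = gradient[0][:, 1]  # here: t = [1., 3.]
```

So gradient[0][0] is the full (du/dt, du/dx) pair for the *first sample*, not the time derivative for all samples; you need to index the second dimension, as in gradient[0][:, 0].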

Best.

K. Frank

meyro123
(meyro123)
October 26, 2020, 6:22pm
#3
Hi Frank,

thank you very much for your explanation, it is indeed very helpful!

Sorry, I was being unspecific. I meant that the input to my NN is simply (t, x) ∈ ℝ².

Again, thank you very much.

Best
Fabian