PyTorch molecular dynamics

I wrote a working ~50-line protein molecular dynamics engine with NumPy's autograd, and I'm now porting it to PyTorch.

Here’s the code:

!pip install celluloid

import matplotlib.pyplot as plt
import numpy as np
import torch

n_molecules = 50
n_atoms_per_molecule = 2
box_size = 20
temperature = 0.5

def energy(positions):
    """Compute the energy of a shifted Lennard-Jones system."""
    sigma = 1.0
    epsilon = 1.0
    rc = 3 * sigma  # cutoff radius
    # potential value at the cutoff; subtracted per pair to shift the energy
    e0 = 4 * epsilon * ((sigma / rc)**12 - (sigma / rc)**6)
    natoms = positions.shape[0]
    # indices of all unique atom pairs (i < j)
    a, b = torch.triu_indices(natoms, natoms, 1)
    d = positions[a] - positions[b]
    r2 = torch.sum(d**2, dim=1)
    c6 = (sigma**2 / r2)**3   # (sigma/r)^6
    # subtract the shift e0 once for every interacting pair
    energy = -e0 * (c6 != 0.0).sum()
    c12 = c6**2               # (sigma/r)^12
    energy += torch.sum(4 * epsilon * (c12 - c6))
    return energy
    
def total_energy(positions):
    return energy(positions)

velocities = torch.tensor(np.random.uniform(low=-10, high=10, size=(n_molecules, 2)), requires_grad=True)
positions = torch.tensor(np.random.uniform(low=-box_size/2, high=box_size/2, size=(n_molecules, 2)), requires_grad=True)

from celluloid import Camera
camera = Camera(plt.figure())

for t in range(10000):
    energies = total_energy(positions)
    energies.backward()

    with torch.no_grad():
        forces = -positions.grad
        velocities = velocities + forces*1e-2
        #velocities = velocities * torch.sqrt(temperature/ torch.mean(velocities**2)) #thermostat
        positions = positions + velocities*1e-2
    
anim = camera.animate(blit=True, interval=50)
plt.style.use("dark_background")
anim.save('output.mp4')

However, I get this error:

RuntimeError: element 0 of tensors does not require grad and does not have a grad_fn

Sorry, I don't really have time to dig into this code, so can you please point out which line the error is coming from? I'm sure it's printed in the console too.

Ok — the traceback points at `energies.backward()`, but only from the second loop iteration onward. The real cause is the update `positions = positions + velocities*1e-2` inside the `with torch.no_grad():` block. That assignment rebinds `positions` to a brand-new tensor created with gradient tracking disabled, so it has `requires_grad=False` and no `grad_fn`. On the next iteration, `total_energy(positions)` then returns a tensor with no graph attached, and calling `.backward()` on it fails with exactly that error.

Note that `torch.no_grad()` doesn't remove gradients from existing tensors; it only stops new operations from being recorded in the autograd graph. The fix is to update the leaf tensors in place (`positions += velocities*1e-2`), so `positions` stays the same grad-tracking leaf, and to clear `positions.grad` (e.g. `positions.grad = None`) at the end of each step so gradients don't accumulate across iterations.

Separately, for the animation you need to draw each frame (e.g. `plt.scatter(...)`) and call `camera.snap()` inside the loop; as written, nothing is ever snapped, so `output.mp4` will be empty.
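For reference, here's a minimal self-contained sketch of an integration loop that avoids this RuntimeError: it updates the leaf tensor in place under `torch.no_grad()` (so `positions` keeps `requires_grad=True`) and clears `positions.grad` each step. A toy harmonic potential stands in for the Lennard-Jones energy just to keep the example short; the loop structure is what matters.

```python
import torch

torch.manual_seed(0)

n_atoms = 8
positions = torch.rand(n_atoms, 2, dtype=torch.float64, requires_grad=True)
velocities = torch.zeros(n_atoms, 2, dtype=torch.float64)  # no grad needed
dt = 1e-2

def total_energy(pos):
    # Toy stand-in potential: harmonic springs pulling every atom to the origin.
    return torch.sum(pos**2)

for step in range(100):
    e = total_energy(positions)
    e.backward()                      # fills positions.grad

    with torch.no_grad():
        forces = -positions.grad
        velocities += forces * dt
        # velocities *= torch.sqrt(0.5 / torch.mean(velocities**2))  # optional rescaling thermostat
        positions += velocities * dt  # in-place: positions stays a grad-tracking leaf

    positions.grad = None             # reset so gradients don't accumulate

print(positions.requires_grad)  # True — backward() keeps working every iteration
```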