I can’t reproduce your error; my guess is that the problem comes from your data set. Since I don’t have it, I trained on random data instead:
import torch
import torch.nn as nn
import torch.optim as optim
from torch.autograd import Variable

model = Net()  # Net is the model from your question
optimizer = optim.Adam(model.parameters(), lr=0.01)
criterion = nn.MSELoss()

def train(epochs):
    model.train()
    for epoch in range(1, epochs + 1):
        # I don't have your data set, so I use a random input and a constant target:
        data = Variable(torch.rand(1, 100))
        target = Variable(torch.ones(1, 100))
        optimizer.zero_grad()
        output = model(data)
        loss = criterion(output, target)
        loss.backward()
        optimizer.step()
        # sum of fc2's weights: if this changes between epochs, the weights are being updated
        s = torch.sum(model.fc2.weight.data)
        print(s)

train(100)
This prints the sum of fc2's weights after each epoch, and the value clearly changes, so the parameters are being updated:
2.79897637193
5.36990540938
7.54833696394
9.55001756023
11.2473634735
13.7405790995
16.014156151
18.0902192594
20.0432560109
22.2485624477
24.2093355584
25.687371045
27.022266482
…
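For completeness: Net above stands for whatever model you defined in your question. Since it isn't shown in my snippet, a minimal stand-in with an fc2 layer matching the 1x100 shapes would look something like this (just my assumption for testing, not your actual architecture):

import torch
import torch.nn as nn
import torch.nn.functional as F

class Net(nn.Module):
    # hypothetical two-layer model; the only thing the snippet above relies on
    # is that the model has an fc2 attribute with a weight tensor
    def __init__(self):
        super(Net, self).__init__()
        self.fc1 = nn.Linear(100, 100)
        self.fc2 = nn.Linear(100, 100)

    def forward(self, x):
        x = F.relu(self.fc1(x))
        return self.fc2(x)

With random inputs like the ones above, the weight sum moves every epoch, which is why I suspect the issue is in your data rather than in the training loop itself.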