And if I keep volatile=True in
batch = Variable(model.get_batch(sentences), volatile=True)
I get the error below, which was the main reason I was trying to set it to False or remove it altogether, among the other options I listed in my previous reply/update on this question. I have also tried looking up similar questions, both on this forum and on Stack Overflow.
RuntimeError Traceback (most recent call last)
<ipython-input-26-cf5dced5a7fc> in <module>()
58 loss = total_loss + regularizer
59
---> 60 loss.backward()
61 # loss.backward(retain_graph=True)
/anaconda/envs/py35/lib/python3.5/site-packages/torch/autograd/variable.py in backward(self, gradient, retain_graph, create_graph, retain_variables)
154 Variable.
155 """
--> 156 torch.autograd.backward(self, gradient, retain_graph, create_graph, retain_variables)
157
158 def register_hook(self, hook):
/anaconda/envs/py35/lib/python3.5/site-packages/torch/autograd/__init__.py in backward(variables, grad_variables, retain_graph, create_graph, retain_variables)
96
97 Variable._execution_engine.run_backward(
---> 98 variables, grad_variables, retain_graph)
99
100
RuntimeError: element 0 of variables tuple is volatile
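For context, the error happens because `volatile=True` propagates through every operation on the batch, so the loss itself ends up volatile and carries no graph to backpropagate through. The rule is: use volatile (or, in PyTorch 0.4+, `torch.no_grad()`, which replaced it) only for inference passes, and run the training forward pass without it. A minimal sketch of that distinction, using the modern `torch.no_grad()` API on a made-up computation (the tensor names here are illustrative, not from the original code):

```python
import torch

# A leaf tensor we want gradients for (analogous to model parameters).
x = torch.randn(3, requires_grad=True)

# Inference pass: no graph is recorded, so backward() would fail here,
# just like calling loss.backward() on a volatile Variable did.
with torch.no_grad():
    y_inference = (x * 2).sum()

# The no-grad output is detached from any graph.
assert not y_inference.requires_grad

# Training pass: run the same forward computation OUTSIDE no_grad()
# so autograd records the graph and backward() succeeds.
y_train = (x * 2).sum()
y_train.backward()
assert x.grad is not None
```

In other words, rather than flipping `volatile` to False on the same batch, build the training batch without `volatile` at all and reserve the volatile/no-grad path for evaluation only.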