Even with `allow_unused=True` in `torch.autograd.grad()`, I still get an error

Hi, when I run the following code, I get an error:

```python
torch.autograd.grad(cross_entropy_loss, model.omninet_first.parameters(), allow_unused=True)
```
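
For reference, here is a minimal standalone sketch that seems to reproduce the same error. The toy `nn.Linear` stands in for my actual `omninet_first` module, and I am only assuming the trigger is a parameter with `requires_grad=False`:

```python
import torch
import torch.nn as nn

# Toy stand-in for model.omninet_first (my real module is larger).
layer = nn.Linear(4, 2)
layer.weight.requires_grad_(False)  # assumption: one parameter ended up frozen

x = torch.randn(3, 4)
target = torch.tensor([0, 1, 1])
cross_entropy_loss = nn.functional.cross_entropy(layer(x), target)

# Raises the same RuntimeError, even with allow_unused=True.
torch.autograd.grad(cross_entropy_loss, layer.parameters(), allow_unused=True)
```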

The error:
```
File "/Users/ghaniebrahimi/opt/anaconda3/lib/python3.8/site-packages/torch/autograd/__init__.py", line 190, in grad
    return Variable._execution_engine.run_backward(

RuntimeError: One of the differentiated Tensors does not require grad
```

Isn't `allow_unused=True` supposed to bypass this error? What is the problem here?
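
From the docs, I understood `allow_unused=True` to cover the case where a tensor simply does not participate in the loss. For example, this toy snippet (names made up for illustration) runs fine and just returns `None` for the unused tensor:

```python
import torch

a = torch.randn(3, requires_grad=True)
b = torch.randn(3, requires_grad=True)  # requires grad, but unused in the loss
loss = a.sum()

# Runs fine: the gradient for the unused tensor is simply None.
grad_a, grad_b = torch.autograd.grad(loss, [a, b], allow_unused=True)
print(grad_a)  # tensor([1., 1., 1.])
print(grad_b)  # None
```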