def forward(self, x):
x = self.conv1(x)
x = F.relu(x)
x = self.conv2(x)
x = F.max_pool2d(x, 2)
x = self.dropout1(x)
x = torch.flatten(x, 1)
x = self.fc1(x)
x = F.relu(x)
x = self.dropout2(x)
x = self.fc2(x)
output = F.log_softmax(x, dim=1)
return output
The main script is as follows:
import Net1
net = Net1.Net()
optimizer = torch.optim.Adam(net.parameters(), lr=0.05)
The two problems come from the code above:
(1) I want to use Adadelta as the optimizer, but torch.optim.Adadelta cannot be found.
(2) Even when I use Adam as the optimizer, net.parameters() cannot be found either.
I work in PyCharm, so I installed torch in the PyCharm environment. I thought that maybe PyCharm could not install the torch package completely, so I also installed torch with pip3, but the problems are still there.
Please help me! Thank you very very much.
Thank you very much. I made a spelling mistake with Adadelta here, but in my code the word is spelled correctly. There is no warning or error when I run the code. Sorry, I just found that there is an underline below “Adadelta” and “parameters()” with the index information. I am worried that this index information indicates some mistake that cannot be found by the syntax check.
Thank you again.
What kind of index information do you get?
I guess it might be a warning from PyCharm, as it probably cannot resolve some symbols and thus yields these warnings. Could this be the case?
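If it helps, a quick way to check whether the symbols actually exist at runtime, independent of what PyCharm's stubs say, is something like the sketch below. It is demonstrated with a stdlib module so it runs anywhere; the torch names in the comment are the ones you would substitute in your case.

```python
import importlib

def has_symbol(module_name: str, attr: str) -> bool:
    """Import a module and report whether it exposes the attribute at runtime."""
    mod = importlib.import_module(module_name)
    return hasattr(mod, attr)

# Demonstrated with the stdlib json module; in this thread you would check
# has_symbol("torch.optim", "Adadelta") in the same way.
print(has_symbol("json", "dumps"))     # True
print(has_symbol("json", "Adadelta"))  # False
```

If the runtime check returns True while PyCharm still underlines the name, the warning comes from incomplete .pyi stubs, not from a broken install.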
Thank you very much.
(1) The index information for net.parameters() is “Cannot find reference ‘parameters’ in ‘(input:Any,…),kwargs:dict)->Any’”.
(2) The index information for .Adadelta is “Cannot find reference ‘Adadelta’ in ‘__init__.pyi’”.
I found /usr/local/lib/python3.6/dist-packages/torch/optim/__init__.pyi. Its code is:
from .sgd import SGD as SGD
from .adam import Adam as Adam
from . import lr_scheduler as lr_scheduler
There is no Adadelta in the file, so is my version incomplete or not? I tried to install torch in two ways: installing the package inside PyCharm, and running “sudo pip3 install torch” in a terminal. The result is the same.
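A check one could run here, independent of the stub file: the .pyi only affects what PyCharm can resolve, while the installed module itself may export more. The sketch below lists a module's actual runtime exports, demonstrated with a stdlib module so it runs anywhere; in this situation the call would be runtime_exports("torch.optim").

```python
import importlib

def runtime_exports(module_name: str) -> list:
    """Return the public attribute names a module actually exposes at runtime."""
    mod = importlib.import_module(module_name)
    return sorted(name for name in dir(mod) if not name.startswith("_"))

# Shown with the stdlib json module; for this thread one would inspect
# runtime_exports("torch.optim") and look for "Adadelta" in the result.
exports = runtime_exports("json")
print("dumps" in exports)  # True
```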
As for the net.parameters() problem, I wonder whether I need to define a parameters method in my network. In earlier versions there was no need, but now I do not know.
Despite the index information in PyCharm, there is no warning or error while running. But yesterday I tried to train the network, and the results show no improvement at all, so I suspect the missing optimizer and the parameters() method are the cause.
Thank you very much. I have done my best to solve the problem, but my knowledge of PyTorch is too little to do it alone. Please help me. Thank you again.