Asking for help with the optimizer in Torch 1.3.0

I installed Torch 1.3.0. I think it is different from my previous version. Now I have two problems with the optimizer.
My network is:

import torch
import torch.nn as nn
import torch.nn.functional as F

class Net(nn.Module):
    def __init__(self):
        super(Net, self).__init__()
        self.conv1 = nn.Conv2d(1, 32, 3, 1)
        self.conv2 = nn.Conv2d(32, 64, 3, 1)
        self.dropout1 = nn.Dropout2d(0.25)
        self.dropout2 = nn.Dropout2d(0.5)
        self.fc1 = nn.Linear(9216, 128)
        self.fc2 = nn.Linear(128, 10)

    def forward(self, x):
        x = self.conv1(x)
        x = F.relu(x)
        x = self.conv2(x)
        x = F.max_pool2d(x, 2)
        x = self.dropout1(x)
        x = torch.flatten(x, 1)
        x = self.fc1(x)
        x = F.relu(x)
        x = self.dropout2(x)
        x = self.fc2(x)
        output = F.log_softmax(x, dim=1)
        return output

The main function is as below:

import torch
import Net1

net = Net1.Net()
optimizer = torch.optim.Adam(net.parameters(), lr=0.05)

The two problems come from the above code:
(1) I want to use Adadelta as the optimizer, but torch.optim.Adadelta cannot be found.
(2) Even though I use Adam as the optimizer, net.parameters() cannot be found either.
I work in PyCharm, so I installed torch in the PyCharm environment. I thought that maybe PyCharm could not install the torch package completely, so I also used pip3 to install torch. The problems are still there.
Please help me! Thank you very, very much.

  1. What kind of error message do you get? torch.optim.Adadelta is available in 1.3.0.

  2. Again, what kind of error do you get that makes you think the parameters cannot be found?
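
A quick runtime check (a minimal sketch, independent of any IDE) can confirm whether Adadelta is really missing from the installed package:

```python
import torch

# Print the installed version and check whether Adadelta is
# actually exported at runtime; on 1.3.0 this should be True.
print(torch.__version__)
print(hasattr(torch.optim, "Adadelta"))
```

If this prints `True`, the optimizer exists and any "cannot find reference" message is coming from the IDE, not from PyTorch.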

Thank you very much for your reply.
Actually, the error is the same.

For the following code:
optimizer = torch.optim.Adadelt(net.parameters(), lr=0.05)

Under “Adadelt” there is an underline that says: “Cannot find reference ‘Adadelt’ in __init__.py”. And I checked __init__.py; Adadelt is not there.

Under “parameters()” the underline says: “Cannot find reference ‘parameters’ in (input: (Any, …))”.

Maybe you just have a typo and have written Adadelt instead of Adadelta?

However, is this just a warning from your IDE or a real error?

Thank you very much. I made a spelling mistake with Adadelta in this post, but in my code the word is correct. There is no warning or error when I run the code. Sorry, I just noticed that there is an underline below “Adadelta” and “parameters()” with the index information. I am worried that this index information might cause mistakes that cannot be found as syntax errors.
Thank you again.

What kind of index information do you get?
I guess it might be a warning from PyCharm, as it probably cannot resolve some symbols and thus yields some warnings. Could this be the case?

Thank you very much.
(1) The index information for net.parameters() is “Cannot find reference ‘parameters’ in (input: Any, …), kwargs: dict) -> Any”.
(2) The index information for “.Adadelta” is “Cannot find reference ‘Adadelta’ in __init__.pyi”.

I found /usr/local/lib/python3.6/dist-packages/torch/optim/__init__.pyi. The code is:
from .sgd import SGD as SGD
from .adam import Adam as Adam
from . import lr_scheduler as lr_scheduler

There is no Adadelta in the file. So is my version incomplete or not? I tried to install torch in two ways: installing the package inside PyCharm, and entering “sudo pip3 install torch” in the terminal. The result is the same.

And for the net.parameters() problem, I wonder whether I need to define a parameters method in my network. In the previous version there was no need, but now I do not know.

Despite the index information in PyCharm, there is no warning or error while running. But yesterday I tried to train the network, and the result shows no improvement at all, so I suspect the missing optimizer and the parameters() method.

Thank you very much. I am doing my best to solve the problem, but my knowledge of PyTorch is too little for that. Please help me. Thank you again.

These are just warnings, which you can ignore for now. We had a recent discussion about some improvements here in the forum, but I cannot find it.

As long as your code doesn’t throw any errors or warnings, you should be fine.

If I interpret the warning correctly, PyCharm just cannot find the docs and references for these methods.
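
To convince yourself that both Adadelta and parameters() work at runtime, you can run a tiny end-to-end sketch (the model here is just an illustrative placeholder, not your network):

```python
import torch
import torch.nn as nn

# A tiny stand-in model: enough to show that parameters() and
# torch.optim.Adadelta resolve fine, whatever the IDE underlines say.
model = nn.Linear(4, 2)
optimizer = torch.optim.Adadelta(model.parameters(), lr=1.0)

x = torch.randn(8, 4)
loss = model(x).pow(2).mean()

optimizer.zero_grad()
loss.backward()
optimizer.step()  # runs without error; the warnings are IDE-only
```

If this script runs cleanly, the “Cannot find reference” messages are purely a PyCharm type-stub limitation and have no effect on training.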