Variable declared in nn.Parameter is not shown in model.parameters output

Hi!

I implemented a model in which one variable is wrapped in nn.Parameter() so that it can be updated during the optimization process. However, when I request the list of variables defined in the model, that variable is not displayed.

Here is the class; in this case, the variable I need to optimize is W_ij:

class CapsNet(nn.Module):
	def __init__(self, r = 1):
		super(CapsNet, self).__init__()
		self.ReLUConv1 = nn.Sequential(
			nn.Conv2d(in_channels=1, out_channels=256, kernel_size=9, stride=1),
			nn.ReLU(inplace=True)
		)
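		# Note: nn.Parameter sets requires_grad=True by default, so the
		# explicit requires_grad=True inside torch.rand is redundant here.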
		self.W_ij = nn.Parameter(torch.rand((32, 10, 6*6, 16, 8), requires_grad=True))
		self.PrimaryCaps = nn.ModuleList()
		self.r = r
		for _ in range(32):
			self.PrimaryCaps.append(nn.Conv2d(in_channels=256, out_channels=8, kernel_size=9, stride=2))
		self.decoder = nn.Sequential(
			nn.Linear(16*10, 512),
			nn.ReLU(inplace=True),
			nn.Linear(512, 1024),
			nn.ReLU(inplace=True),
			nn.Linear(1024, 784),
			nn.Sigmoid()
		)

	def forward(self)...

And here is the list of variables displayed by the model parameters output:

model = CapsNet()
model.parameters

<bound method Module.parameters of CapsNet(
  (ReLUConv1): Sequential(
	(0): Conv2d(1, 256, kernel_size=(9, 9), stride=(1, 1))
	(1): ReLU(inplace)
  )
  (PrimaryCaps): ModuleList(
	(0): Conv2d(256, 8, kernel_size=(9, 9), stride=(2, 2))
	(1): Conv2d(256, 8, kernel_size=(9, 9), stride=(2, 2))
	(2): Conv2d(256, 8, kernel_size=(9, 9), stride=(2, 2))
	(3): Conv2d(256, 8, kernel_size=(9, 9), stride=(2, 2))
	(4): Conv2d(256, 8, kernel_size=(9, 9), stride=(2, 2))
	(5): Conv2d(256, 8, kernel_size=(9, 9), stride=(2, 2))
	(6): Conv2d(256, 8, kernel_size=(9, 9), stride=(2, 2))
	(7): Conv2d(256, 8, kernel_size=(9, 9), stride=(2, 2))
	(8): Conv2d(256, 8, kernel_size=(9, 9), stride=(2, 2))
	(9): Conv2d(256, 8, kernel_size=(9, 9), stride=(2, 2))
	(10): Conv2d(256, 8, kernel_size=(9, 9), stride=(2, 2))
	(11): Conv2d(256, 8, kernel_size=(9, 9), stride=(2, 2))
	(12): Conv2d(256, 8, kernel_size=(9, 9), stride=(2, 2))
	(13): Conv2d(256, 8, kernel_size=(9, 9), stride=(2, 2))
	(14): Conv2d(256, 8, kernel_size=(9, 9), stride=(2, 2))
	(15): Conv2d(256, 8, kernel_size=(9, 9), stride=(2, 2))
	(16): Conv2d(256, 8, kernel_size=(9, 9), stride=(2, 2))
	(17): Conv2d(256, 8, kernel_size=(9, 9), stride=(2, 2))
	(18): Conv2d(256, 8, kernel_size=(9, 9), stride=(2, 2))
	(19): Conv2d(256, 8, kernel_size=(9, 9), stride=(2, 2))
	(20): Conv2d(256, 8, kernel_size=(9, 9), stride=(2, 2))
	(21): Conv2d(256, 8, kernel_size=(9, 9), stride=(2, 2))
	(22): Conv2d(256, 8, kernel_size=(9, 9), stride=(2, 2))
	(23): Conv2d(256, 8, kernel_size=(9, 9), stride=(2, 2))
	(24): Conv2d(256, 8, kernel_size=(9, 9), stride=(2, 2))
	(25): Conv2d(256, 8, kernel_size=(9, 9), stride=(2, 2))
	(26): Conv2d(256, 8, kernel_size=(9, 9), stride=(2, 2))
	(27): Conv2d(256, 8, kernel_size=(9, 9), stride=(2, 2))
	(28): Conv2d(256, 8, kernel_size=(9, 9), stride=(2, 2))
	(29): Conv2d(256, 8, kernel_size=(9, 9), stride=(2, 2))
	(30): Conv2d(256, 8, kernel_size=(9, 9), stride=(2, 2))
	(31): Conv2d(256, 8, kernel_size=(9, 9), stride=(2, 2))
  )
  (decoder): Sequential(
	(0): Linear(in_features=160, out_features=512, bias=True)
	(1): ReLU(inplace)
	(2): Linear(in_features=512, out_features=1024, bias=True)
	(3): ReLU(inplace)
	(4): Linear(in_features=1024, out_features=784, bias=True)
	(5): Sigmoid()
  )
)>

Only the variables defined in the sequential modules are listed as parameters of my model, but not the variable W_ij. In this regard, I have two questions:

  1. How can I include the variable W_ij in the set of parameters to be updated in the optimization step? So far, the loss function does not change during that part, and I believe this is the reason.
  2. Do I have to set requires_grad = True in the definition of the variable even though it is wrapped in nn.Parameter()?

Any suggestions?

Henry

Could you try to print the parameters using:

print(list(model.parameters()))

Hi @ptrblck:

Here is part of the output, because it is larger than the number of characters allowed:

[Parameter containing:
tensor([[[[[7.6739e-01, 5.7694e-01, 1.4690e-01,  ..., 8.4265e-01,
			1.0182e-02, 5.0916e-01],
		   [5.8372e-01, 7.2278e-01, 8.8761e-01,  ..., 3.6188e-01,
			9.8832e-02, 7.6833e-01],
		   [1.7700e-01, 4.6641e-01, 8.7544e-01,  ..., 1.2639e-01,
			4.7846e-01, 5.5806e-01],
		   ...,
		   [5.3375e-02, 2.3048e-01, 3.9690e-01,  ..., 1.2138e-01,
			4.3386e-01, 3.4856e-01]],

		  ...

(output truncated; it continues in the same pattern for the remaining entries)
I uploaded the full output to the following link, in case you want to review the entire set of records:

https://github.com/henrychacon/CapsNet/blob/master/parameters.txt

Hi @ptrblck, by the way, the version of PyTorch I am using is 1.0.1.post2.

Thanks!
Henry

It would probably be easier to just check the names, since the actual parameter values are quite large:

for name, _ in model.named_parameters():
    print(name)

Could you run this code and check if your parameter is there?

Hi @ptrblck:

When I run the command:

for name, _ in model.named_parameters():
    print(name)

The parameter W_ij is displayed; however, it is not optimized during the learning process.

  • Is there any way I can explicitly include that parameter in the optimization process?
  • Why is the parameter shown only when iterating over model.named_parameters() in the for loop (not when I execute print(model.named_parameters)), and not in the model.parameters used as input to optimizer = Adam(model.parameters())?

Here is the output:

W_ij
ReLUConv1.0.weight
ReLUConv1.0.bias
PrimaryCaps.0.weight
PrimaryCaps.0.bias
PrimaryCaps.1.weight
PrimaryCaps.1.bias
PrimaryCaps.2.weight
PrimaryCaps.2.bias
PrimaryCaps.3.weight
PrimaryCaps.3.bias
PrimaryCaps.4.weight
PrimaryCaps.4.bias
PrimaryCaps.5.weight
PrimaryCaps.5.bias
PrimaryCaps.6.weight
PrimaryCaps.6.bias
PrimaryCaps.7.weight
PrimaryCaps.7.bias
PrimaryCaps.8.weight
PrimaryCaps.8.bias
PrimaryCaps.9.weight
PrimaryCaps.9.bias
PrimaryCaps.10.weight
PrimaryCaps.10.bias
PrimaryCaps.11.weight
PrimaryCaps.11.bias
PrimaryCaps.12.weight
PrimaryCaps.12.bias
PrimaryCaps.13.weight
PrimaryCaps.13.bias
PrimaryCaps.14.weight
PrimaryCaps.14.bias
PrimaryCaps.15.weight
PrimaryCaps.15.bias
PrimaryCaps.16.weight
PrimaryCaps.16.bias
PrimaryCaps.17.weight
PrimaryCaps.17.bias
PrimaryCaps.18.weight
PrimaryCaps.18.bias
PrimaryCaps.19.weight
PrimaryCaps.19.bias
PrimaryCaps.20.weight
PrimaryCaps.20.bias
PrimaryCaps.21.weight
PrimaryCaps.21.bias
PrimaryCaps.22.weight
PrimaryCaps.22.bias
PrimaryCaps.23.weight
PrimaryCaps.23.bias
PrimaryCaps.24.weight
PrimaryCaps.24.bias
PrimaryCaps.25.weight
PrimaryCaps.25.bias
PrimaryCaps.26.weight
PrimaryCaps.26.bias
PrimaryCaps.27.weight
PrimaryCaps.27.bias
PrimaryCaps.28.weight
PrimaryCaps.28.bias
PrimaryCaps.29.weight
PrimaryCaps.29.bias
PrimaryCaps.30.weight
PrimaryCaps.30.bias
PrimaryCaps.31.weight
PrimaryCaps.31.bias
decoder.0.weight
decoder.0.bias
decoder.2.weight
decoder.2.bias
decoder.4.weight
decoder.4.bias

If the parameter is shown in model.named_parameters(), it will also be in model.parameters(), since internally parameters() calls named_parameters() as shown here. Note that model.parameters and model.named_parameters without parentheses just refer to the bound methods, which is why printing them shows the method (and the module repr) rather than the parameters; you have to call them, e.g. list(model.parameters()).
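As a quick sanity check (a minimal sketch, assuming the CapsNet class from your first post), you can confirm that W_ij is among the tensors handed to the optimizer:

from torch.optim import Adam

model = CapsNet()

# W_ij is registered as an nn.Parameter, so it appears here:
assert 'W_ij' in [name for name, _ in model.named_parameters()]

# The very same tensor object ends up in the optimizer's param groups:
optimizer = Adam(model.parameters())
assert any(p is model.W_ij
           for group in optimizer.param_groups
           for p in group['params'])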

How do you check whether the parameter has been optimized? Do you subtract it from itself? If so, you may want to clone the previous version, because if you don't clone it, the variable only holds a reference to the parameter, which means the difference will always be 0.
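For example, a minimal sketch of that check (criterion, data, and target are placeholders here, since the forward pass isn't shown above):

# Snapshot BEFORE the update; .clone() copies the data, whereas a plain
# assignment would only keep a reference and the difference below would
# always be 0.
w_before = model.W_ij.detach().clone()

output = model(data)              # placeholder forward pass
loss = criterion(output, target)  # placeholder loss
optimizer.zero_grad()
loss.backward()
optimizer.step()

# Non-zero only if the optimizer actually updated W_ij:
print((model.W_ij.detach() - w_before).abs().max())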

Hi @justusschock, in my model I have two loss functions: one is associated with a matrix parameter defined with nn.Parameter() and the other with a sequential CNN. The first loss value stays constant during the optimization process, while the second one does change.

Hi @justusschock and @ptrblck: thanks for your help. I changed the optimizer from Adam to SGD, and now the loss is being minimized.