How to replace a layer in a Sequential module

Hi Samuel!

Your code works for me (with an old 0.3.0 version of PyTorch).

I copy-pasted your code and ran it:

>>> mm = MLP(2, 16, 3)
>>> mm
MLP(
  (dropout1): Dropout(p=0.2)
  (dropout2): Dropout(p=0.2)
  (linear1): Linear(in_features=2, out_features=16)
  (linear2): Linear(in_features=16, out_features=16)
  (linear3): Linear(in_features=16, out_features=3)
  (sequential_module): Sequential(
    (0): Linear(in_features=16, out_features=16)
    (1): ReLU()
    (2): Dropout(p=0.2)
  )
  (relu1): ReLU()
  (relu2): ReLU()
)
>>> replace_layer(mm)
>>> mm
MLP(
  (dropout1): Dropout(p=0.2)
  (dropout2): Dropout(p=0.2)
  (linear1): Linear(in_features=2, out_features=16)
  (linear2): Linear(in_features=16, out_features=16)
  (linear3): Linear(in_features=16, out_features=3)
  (sequential_module): Sequential(
    (0): Linear(in_features=16, out_features=16)
    (1): Sigmoid()
    (2): Dropout(p=0.2)
  )
  (relu1): Sigmoid()
  (relu2): Sigmoid()
)
>>> torch.__version__
'0.3.0b0+591e73e'
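
In case it helps others reading this thread, here is a minimal sketch (my own reconstruction of what Samuel's code might look like, not his exact code) of such an MLP together with a replace_layer() helper that swaps every ReLU for a Sigmoid, including the one inside the Sequential, by recursing over named_children():

import torch
import torch.nn as nn

class MLP(nn.Module):
    def __init__(self, in_dim, hidden_dim, out_dim):
        super().__init__()
        self.dropout1 = nn.Dropout(p=0.2)
        self.dropout2 = nn.Dropout(p=0.2)
        self.linear1 = nn.Linear(in_dim, hidden_dim)
        self.linear2 = nn.Linear(hidden_dim, hidden_dim)
        self.linear3 = nn.Linear(hidden_dim, out_dim)
        self.sequential_module = nn.Sequential(
            nn.Linear(hidden_dim, hidden_dim),
            nn.ReLU(),
            nn.Dropout(p=0.2),
        )
        self.relu1 = nn.ReLU()
        self.relu2 = nn.ReLU()

    def forward(self, x):
        x = self.relu1(self.dropout1(self.linear1(x)))
        x = self.sequential_module(x)
        x = self.relu2(self.dropout2(self.linear2(x)))
        return self.linear3(x)

def replace_layer(module):
    # Walk the immediate children; replace ReLUs, recurse into containers.
    for name, child in module.named_children():
        if isinstance(child, nn.ReLU):
            setattr(module, name, nn.Sigmoid())
        else:
            replace_layer(child)

Note that setattr() also works on the Sequential's children, because nn.Sequential registers them under the string names "0", "1", "2", so the same loop covers both the top-level ReLUs and the one nested in the container.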

Best.

K. Frank