Yes, I guess you could call `DenseNetConv` an untrained feature extractor.
Yes. You've frozen all trainable parameters of the feature extractor by setting their `requires_grad` attribute to `False`, while the linear layers in `MyDenseNetDens` should still be trainable, since their attributes were not changed. Note, however, that calling `.train()` on the model will put the batch norm layers into training mode: they will normalize the forward activations using the current batch statistics and will update their running stats, even though their affine parameters are frozen.
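As a minimal sketch of this behavior (using a small stand-in module, since the thread's `DenseNetConv` / `MyDenseNetDens` definitions aren't shown here): freezing `requires_grad` stops gradient updates, but a `BatchNorm2d` layer in `train()` mode still updates its running statistics on every forward pass.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Stand-in for DenseNetConv: a small conv stack containing BatchNorm.
feature_extractor = nn.Sequential(
    nn.Conv2d(3, 8, kernel_size=3, padding=1),
    nn.BatchNorm2d(8),
    nn.ReLU(),
)

# Freeze all trainable parameters of the feature extractor.
for param in feature_extractor.parameters():
    param.requires_grad = False

# Stand-in for the linear layers in MyDenseNetDens: left trainable.
classifier = nn.Linear(8, 10)

# In train() mode the BatchNorm layer normalizes with the batch stats
# and updates its running stats, even though its weight/bias are frozen.
feature_extractor.train()
bn = feature_extractor[1]
before = bn.running_mean.clone()
with torch.no_grad():
    feature_extractor(torch.randn(4, 3, 16, 16))
print(torch.equal(before, bn.running_mean))  # False: running stats moved

# Switch to eval() to normalize with the stored running stats instead.
feature_extractor.eval()
```

If you want the frozen extractor to behave as a fixed function during fine-tuning, call `feature_extractor.eval()` (or override `.train()` for those submodules) so the running stats are used and no longer updated.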