This question may sound silly. Is it possible to take a few layers from a built-in model, for example resnet50?
For example, I only want to pull out the first 2 layers from resnet50. Is there a way to make that happen?
Yes, that's possible, and you could e.g. reuse them directly as seen here:

import torch
import torchvision.models as models

model = models.resnet50()
conv1 = model.conv1
bn1 = model.bn1

x = torch.randn(1, 3, 224, 224)
out = conv1(x)
out = bn1(out)
print(out.shape)
> torch.Size([1, 64, 112, 112])
Depending on your use case, you might want to assign these layers as internal modules to a custom nn.Module.
Thank you. What if I take a few layers from resnet50 and make these withdrawn layers an independent neural network?
Could you explain what "make these withdrawn layers an independent neural network" means in this context?
If I withdraw a few layers from resnet50, I wish to use these withdrawn layers and combine them into a new neural network.
Well, I wish to make the following layers into an independent neural network, like the following:

class new_net(nn.Module):
    model.conv1
    model.bn1

Once the new neural network is created, I can then use it like: model = new_net().
Yes, this will also work:

import torch
import torch.nn as nn
import torchvision.models as models

class NewNet(nn.Module):
    def __init__(self):
        super().__init__()
        # copy the first two layers from resnet50 into this module
        resnet = models.resnet50()
        self.conv1 = resnet.conv1
        self.bn1 = resnet.bn1

    def forward(self, x):
        x = self.conv1(x)
        x = self.bn1(x)
        return x

model = NewNet()
x = torch.randn(1, 3, 224, 224)
out = model(x)