I am trying to reconstruct the architecture of a model that uses InceptionV3 pretrained weights; I read the layer info from the model's .tflite file:
import tensorflow as tf

interpreter = tf.lite.Interpreter(model_path="model_predict.tflite")
interpreter.allocate_tensors()
all_tensor_details = interpreter.get_tensor_details()
for tensor_item in all_tensor_details:
    print("Weight %s:" % tensor_item["name"])
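In case it helps, the same loop can also print each tensor's shape and dtype, which makes it easier to match tensors against layers of a rebuilt model (a sketch of the snippet above wrapped in a function; the filename is the one from my script):

```python
import os
import tensorflow as tf

def dump_tensor_details(path):
    """Print name, shape, and dtype of every tensor in a .tflite file."""
    interpreter = tf.lite.Interpreter(model_path=path)
    interpreter.allocate_tensors()
    details = interpreter.get_tensor_details()
    for t in details:
        # Each entry is a dict with "name", "shape", and "dtype" keys.
        print("Weight %s: shape=%s dtype=%s"
              % (t["name"], tuple(t["shape"]), t["dtype"]))
    return details

if os.path.exists("model_predict.tflite"):
    dump_tensor_details("model_predict.tflite")
```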
It uses InceptionV3 weights, with tensor names like:
Weight InceptionV3/Conv2d_4a_3x3/BatchNorm/FusedBatchNorm:
Weight InceptionV3/Mixed_5b/Branch_0/Conv2d_0a_1x1/BatchNorm/FusedBatchNorm:
Am I loading these weights correctly here?:
class Predict(nn.Module):
    def __init__(self, inception):
        super(Predict, self).__init__()
        ...
        model += [inception.Conv2d_4a_3x3]
        model += [tf.nn.batch_normalization(name=None)]
        model += [tf.compat.v1.nn.fused_batch_norm(mean=None, variance=None,
                                                   epsilon=0.001, data_format='NHWC',
                                                   is_training=True, name=None,
                                                   exponential_avg_factor=1.0)]
        model += [inception.Mixed_5b]
...
inception = torchvision.models.inception_v3(pretrained=True)
my_pred = Predict(inception)
By that I mean: was it right to include the batch normalization and fused batch normalization calls, or are they already part of inception.Conv2d_4a_3x3? I have never used InceptionV3 before, and I only found this post, which helped a bit:
but there is no explanation of the batch normalization. Also, for the second weight name, torchvision's InceptionV3 does not expose anything called "Branch_0" separately, so I am assuming that everything, including the batch norms, is part of the same module. That is what I want to confirm.