
python - How can I use tf.keras.Model.summary to see the layers of a child model nested inside a parent model?

I have a subclass of tf.keras.Model; the code is as follows:

import tensorflow as tf


class Mymodel(tf.keras.Model):

    def __init__(self, classes, backbone_model, *args, **kwargs):
        super(Mymodel, self).__init__(*args, **kwargs)
        self.backbone = backbone_model
        self.classify_layer = tf.keras.layers.Dense(classes, activation='sigmoid')

    def call(self, inputs):
        x = self.backbone(inputs)
        x = self.classify_layer(x)
        return x


model = Mymodel(classes=61,
                backbone_model=tf.keras.applications.MobileNet())
model.build(input_shape=(20, 224, 224, 3))
model.summary()

The result is:

_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
mobilenet_1.00_224 (Model)   (None, 1000)              4253864   
_________________________________________________________________
dense (Dense)                multiple                  61061     
=================================================================
Total params: 4,314,925
Trainable params: 4,293,037
Non-trainable params: 21,888
_________________________________________________________________
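
For what it's worth, the nested MobileNet still has its own summary; the sketch below (based on the model built above) prints every MobileNet layer, it just is not merged into the parent model's summary:

# The backbone is itself a tf.keras.Model, so it can be summarised directly,
# outside of the parent model's summary shown above.
model.backbone.summary()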

But I want to see all the layers of MobileNet, so I tried to extract all of its layers and apply them inside the model:

import tensorflow as tf


class Mymodel(tf.keras.Model):

    def __init__(self, classes, backbone_model, *args, **kwargs):
        super(Mymodel, self).__init__(*args, **kwargs)
        self.backbone = backbone_model
        self.classify_layer = tf.keras.layers.Dense(classes, activation='sigmoid')

    def my_process_layers(self, inputs):
        # Run the input through every backbone layer, skipping the InputLayer
        # at index 0.
        layers = self.backbone.layers
        tmp_x = inputs
        for i in range(1, len(layers)):
            tmp_x = layers[i](tmp_x)
        return tmp_x

    def call(self, inputs):
        x = self.my_process_layers(inputs)
        x = self.classify_layer(x)
        return x


model = Mymodel(classes=61,
                backbone_model=tf.keras.applications.MobileNet())
model.build(input_shape=(20, 224, 224, 3))
model.summary()

But the result did not change:

_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
mobilenet_1.00_224 (Model)   (None, 1000)              4253864   
_________________________________________________________________
dense (Dense)                multiple                  61061     
=================================================================
Total params: 4,314,925
Trainable params: 4,293,037
Non-trainable params: 21,888
_________________________________________________________________
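
As far as I can tell, summary() on a subclassed model lists the sub-layers tracked as attributes in __init__ (here self.backbone and self.classify_layer), not whatever call() happens to execute, so looping over the backbone's layers inside call() cannot change the listing. A quick check along these lines (a sketch, the exact names may differ):

# Only the attributes set in __init__ are tracked, regardless of call():
print([layer.name for layer in model.layers])
# expected output, roughly: ['mobilenet_1.00_224', 'dense']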

Then I tried to extract a single layer and insert it into the model:

import tensorflow as tf


class Mymodel(tf.keras.Model):

    def __init__(self, classes, backbone_model, *args, **kwargs):
        super(Mymodel, self).__init__(*args, **kwargs)
        self.backbone = backbone_model
        self.classify_layer = tf.keras.layers.Dense(classes, activation='sigmoid')

    def call(self, inputs):
        # Apply only the second backbone layer (index 1), then classify.
        x = self.backbone.layers[1](inputs)
        x = self.classify_layer(x)
        return x


model = Mymodel(classes=61,
                backbone_model=tf.keras.applications.MobileNet())
model.build(input_shape=(20, 224, 224, 3))
model.summary()

It did not change either. I am quite confused:

_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
mobilenet_1.00_224 (Model)   (None, 1000)              4253864   
_________________________________________________________________
dense (Dense)                multiple                  244       
=================================================================
Total params: 4,254,108
Trainable params: 4,232,220
Non-trainable params: 21,888
_________________________________________________________________

But I noticed that the parameter count of the dense layer changed, and I could not see why at first.
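
My best guess at the parameter change (an assumption based on the layer order of tf.keras.applications.MobileNet): backbone.layers[1] is the initial ZeroPadding2D layer, whose output still has only 3 channels, so the Dense layer is built against a 3-channel last axis: 3 * 61 weights + 61 biases = 244 parameters, which matches the summary above. A quick check (sketch):

# If layers[1] is the ZeroPadding2D layer, its output keeps 3 channels:
print(model.backbone.layers[1].output_shape)   # expected: (None, 225, 225, 3)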


1 Reply


@Ioannis's answer is perfectly fine, but unfortunately it drops the Keras 'Model Subclassing' structure that is present in the question. If, just like me, you want to keep this model subclassing and still show all layers in the summary, you can branch down into the individual layers of the more complex model using a for loop:

import tensorflow as tf


class MyMobileNet(tf.keras.Sequential):
    def __init__(self, input_shape=(224, 224, 3), classes=61):
        super(MyMobileNet, self).__init__()
        # Keep the backbone as a plain list of layers so that each one is
        # tracked (and reported) individually instead of as one sub-model.
        self.backbone_model = [layer for layer in
                               tf.keras.applications.MobileNet(input_shape=input_shape,
                                                               include_top=False,
                                                               pooling='avg').layers]
        self.classificator = tf.keras.layers.Dense(classes, activation='sigmoid',
                                                   name='classificator')

    def call(self, inputs):
        x = inputs
        for layer in self.backbone_model:
            x = layer(x)
        x = self.classificator(x)
        return x


model = MyMobileNet()

After this we can directly build the model and call the summary:

model.build(input_shape=(None, 224, 224, 3))
model.summary()

Model: "my_mobile_net"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
conv1_pad (ZeroPadding2D)    (None, 225, 225, 3)       0         
_________________________________________________________________
conv1 (Conv2D)               (None, 112, 112, 32)      864       
_________________________________________________________________
conv1_bn (BatchNormalization (None, 112, 112, 32)      128       
_________________________________________________________________
....
....
conv_pw_13 (Conv2D)          (None, 7, 7, 1024)        1048576   
_________________________________________________________________
conv_pw_13_bn (BatchNormaliz (None, 7, 7, 1024)        4096      
_________________________________________________________________
conv_pw_13_relu (ReLU)       (None, 7, 7, 1024)        0         
_________________________________________________________________
global_average_pooling2d_13  (None, 1024)              0         
_________________________________________________________________
classificator (Dense)        multiple                  62525     
=================================================================
Total params: 3,291,389
Trainable params: 3,269,501
Non-trainable params: 21,888
_________________________________________________________________
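
For completeness, a minimal usage sketch (my assumptions: random data just to confirm shapes, the default 61 sigmoid outputs) showing that the unpacked layers still behave like the original backbone plus classifier:

import numpy as np

batch = np.random.rand(4, 224, 224, 3).astype('float32')
preds = model(batch)      # runs through every unpacked MobileNet layer
print(preds.shape)        # expected: (4, 61)

Depending on the TensorFlow version, model.summary(expand_nested=True) may also expand nested sub-models in place, but the loop-based approach above does not rely on it.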
