Interpreting the layers of a Keras machine learning model

azpvetkf, posted 2023-06-30 in Other

I have printed out the layers of my model. The code works fine; I just want to understand what the output actually means. The output is below.

Number of layers: 30
Layer types:
input_1 - InputLayer
conv2d - Conv2D
batch_normalization - BatchNormalization
activation - Activation
conv2d_1 - Conv2D
batch_normalization_1 - BatchNormalization
activation_1 - Activation
conv2d_2 - Conv2D
batch_normalization_2 - BatchNormalization
add - Add
activation_2 - Activation
conv2d_3 - Conv2D
batch_normalization_3 - BatchNormalization
activation_3 - Activation
conv2d_4 - Conv2D
conv2d_5 - Conv2D
batch_normalization_4 - BatchNormalization
add_1 - Add
activation_4 - Activation
conv2d_6 - Conv2D
batch_normalization_5 - BatchNormalization
activation_5 - Activation
conv2d_7 - Conv2D
conv2d_8 - Conv2D
batch_normalization_6 - BatchNormalization
add_2 - Add
activation_6 - Activation
average_pooling2d - AveragePooling2D
flatten - Flatten
dense - Dense
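For reference, a listing like the one above can be produced by looping over `model.layers`. A minimal sketch, assuming tensorflow is installed; the toy one-block model here is a stand-in, not the asker's actual network:

```python
import tensorflow as tf  # assumption: tensorflow (with keras) is installed

# Toy stand-in model: a single Conv-BN-ReLU block.
inputs = tf.keras.Input(shape=(32, 32, 3))
x = tf.keras.layers.Conv2D(16, 3, padding="same")(inputs)
x = tf.keras.layers.BatchNormalization()(x)
x = tf.keras.layers.Activation("relu")(x)
model = tf.keras.Model(inputs, x)

# The same kind of printout as in the question.
print(f"Number of layers: {len(model.layers)}")
print("Layer types:")
for layer in model.layers:
    print(f"{layer.name} - {type(layer).__name__}")
```

Note that `model.layers` includes the `InputLayer`, so the count covers every node in the graph, not only the weight-bearing ones.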

Does this mean my model has six layers, each with five hidden layers? Is a weight matrix constructed only for the six layers as a whole, or are weight matrices constructed for all 30 layers when I run the model? How should I interpret the output above?
I used Keras's model.summary(); here is the output:

__________________________________________________________________________________________________
Layer (type)                    Output Shape         Param #     Connected to                     
==================================================================================================
input_1 (InputLayer)            [(None, 32, 32, 3)]  0                                            
__________________________________________________________________________________________________
conv2d (Conv2D)                 (None, 32, 32, 16)   448         input_1[0][0]                    
__________________________________________________________________________________________________
batch_normalization (BatchNorma (None, 32, 32, 16)   64          conv2d[0][0]                     
__________________________________________________________________________________________________
activation (Activation)         (None, 32, 32, 16)   0           batch_normalization[0][0]        
__________________________________________________________________________________________________
conv2d_1 (Conv2D)               (None, 32, 32, 16)   2320        activation[0][0]                 
__________________________________________________________________________________________________
batch_normalization_1 (BatchNor (None, 32, 32, 16)   64          conv2d_1[0][0]                   
__________________________________________________________________________________________________
activation_1 (Activation)       (None, 32, 32, 16)   0           batch_normalization_1[0][0]      
__________________________________________________________________________________________________
conv2d_2 (Conv2D)               (None, 32, 32, 16)   2320        activation_1[0][0]
__________________________________________________________________________________________________
batch_normalization_2 (BatchNor (None, 32, 32, 16)   64          conv2d_2[0][0]
__________________________________________________________________________________________________
add (Add)                       (None, 32, 32, 16)   0           activation[0][0]
                                                                 batch_normalization_2[0][0]
__________________________________________________________________________________________________
activation_2 (Activation)       (None, 32, 32, 16)   0           add[0][0]
__________________________________________________________________________________________________
conv2d_3 (Conv2D)               (None, 16, 16, 32)   4640        activation_2[0][0]
__________________________________________________________________________________________________
batch_normalization_3 (BatchNor (None, 16, 16, 32)   128         conv2d_3[0][0]
__________________________________________________________________________________________________
activation_3 (Activation)       (None, 16, 16, 32)   0           batch_normalization_3[0][0]
__________________________________________________________________________________________________
conv2d_4 (Conv2D)               (None, 16, 16, 32)   9248        activation_3[0][0]
__________________________________________________________________________________________________
conv2d_5 (Conv2D)               (None, 16, 16, 32)   544         activation_2[0][0]
__________________________________________________________________________________________________
batch_normalization_4 (BatchNor (None, 16, 16, 32)   128         conv2d_4[0][0]
__________________________________________________________________________________________________
add_1 (Add)                     (None, 16, 16, 32)   0           conv2d_5[0][0]
                                                                 batch_normalization_4[0][0]
__________________________________________________________________________________________________
activation_4 (Activation)       (None, 16, 16, 32)   0           add_1[0][0]
__________________________________________________________________________________________________
conv2d_6 (Conv2D)               (None, 8, 8, 64)     18496       activation_4[0][0]
__________________________________________________________________________________________________
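The `Connected to` column makes the structure explicit: each `Add` layer has two input rows, an earlier activation and the end of a conv branch, i.e. a residual (skip) connection. A minimal sketch of one such block, assuming tensorflow is installed; the 3x3 kernels and 16 filters are chosen to match the first block of the summary:

```python
import tensorflow as tf  # assumption: tensorflow (with keras) is installed

inputs = tf.keras.Input(shape=(32, 32, 3))
x = tf.keras.layers.Conv2D(16, 3, padding="same")(inputs)    # conv2d
x = tf.keras.layers.BatchNormalization()(x)                  # batch_normalization
skip = tf.keras.layers.Activation("relu")(x)                 # activation
y = tf.keras.layers.Conv2D(16, 3, padding="same")(skip)      # conv2d_1
y = tf.keras.layers.BatchNormalization()(y)                  # batch_normalization_1
y = tf.keras.layers.Activation("relu")(y)                    # activation_1
y = tf.keras.layers.Conv2D(16, 3, padding="same")(y)         # conv2d_2
y = tf.keras.layers.BatchNormalization()(y)                  # batch_normalization_2
out = tf.keras.layers.Add()([skip, y])                       # add: two inputs
out = tf.keras.layers.Activation("relu")(out)                # activation_2
model = tf.keras.Model(inputs, out)
model.summary()
```

The `Add` row of this sketch's summary lists both `activation[0][0]` and `batch_normalization_2[0][0]` as inputs, just like the table above.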

to94eoyn1#

In tensorflow.keras, a "layer" is defined more broadly than you might expect from the literature: a layer is anything that subclasses tensorflow.keras.layers.Layer.
When the model is built, you start with an input layer, which has no parameters (weights). Then comes a Conv2D layer, which does have parameters. The BatchNormalization layer that follows performs normalization and carries only a few parameters per channel: a learnable scale and offset, plus a non-trainable moving mean and variance. That is the 64 (4 per channel times 16 channels) shown next to batch_normalization in your summary. The Activation layers have no parameters at all.
In the model you built, essentially all of the weights live in the Conv2D and Dense layers; BatchNormalization contributes a handful per channel, and the remaining layers (Activation, Add, AveragePooling2D, Flatten) just perform parameter-free operations. So the model is a single graph of 30 layers, not "6 layers of 5 hidden layers each", and a separate weight tensor is created for every parameter-bearing layer.
You can print the layers with model.summary(). It shows which layers contribute to the parameter count and how the network is connected, which makes it a very useful way to debug a model.
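The Param # column can be reproduced by hand, which makes it concrete that only some of the 30 layers carry weights. A plain-Python sketch; the 3x3 and 1x1 kernel sizes are inferred from the counts, since the question does not state them:

```python
def conv2d_params(kh, kw, c_in, c_out):
    # Kernel weights (kh * kw * c_in per output channel) plus one bias each.
    return kh * kw * c_in * c_out + c_out

def batchnorm_params(channels):
    # gamma + beta (trainable) and moving mean + variance (non-trainable).
    return 4 * channels

# Values taken from the model.summary() output above.
assert conv2d_params(3, 3, 3, 16) == 448       # conv2d
assert conv2d_params(3, 3, 16, 16) == 2320     # conv2d_1, conv2d_2
assert batchnorm_params(16) == 64              # batch_normalization
assert conv2d_params(3, 3, 16, 32) == 4640     # conv2d_3 (downsampling block)
assert conv2d_params(1, 1, 16, 32) == 544      # conv2d_5 (1x1 shortcut conv)
assert batchnorm_params(32) == 128             # batch_normalization_3
assert conv2d_params(3, 3, 32, 64) == 18496    # conv2d_6
print("all parameter counts match the summary")
```

InputLayer, Activation, Add, AveragePooling2D and Flatten contribute nothing, which is why their Param # column reads 0.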
