Keras: graph disconnected when connecting two models

tktrz96b asked on 2023-11-19

I am trying to build a new model on top of part of a pretrained model.
Here is some cleaned-up code.
Let's assume we have already trained model1 and want to add some layers that are defined in model2:

import tensorflow.keras as keras
import tensorflow as tf
from tensorflow.keras.layers import Input, Conv2D, Activation
from tensorflow.keras.models import Model, Sequential

model1 = Sequential([
    Conv2D(2, (3,3), padding='same', input_shape=(6,6,1)),
    Activation('relu')
])
model2 = Sequential([
    Conv2D(3, (3,3), padding='same', input_shape=(6,6,2)),
    Activation('softmax')
])

# Feed the output of model1's conv layer into model2, then add a fresh softmax on top
model_merge = Model(inputs=model1.input, 
                    outputs=Activation('softmax')(model2(model1.get_layer('conv2d').output)))

It looks a bit messy, but I wanted to show, by adding the softmax activation here, that the graph is not disconnected.
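As a quick sanity check (a minimal sketch of my own, assuming only numpy on top of the imports above), the merged model runs end to end on dummy input:

import numpy as np

# Forward pass on a dummy batch shaped like model1's input (6x6, 1 channel).
dummy = np.random.rand(1, 6, 6, 1).astype('float32')
print(model_merge.predict(dummy).shape)  # expected: (1, 6, 6, 3)
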
Summary of model1:

_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
conv2d (Conv2D)              (None, 6, 6, 2)           20        
_________________________________________________________________
activation (Activation)      (None, 6, 6, 2)           0         
=================================================================
Total params: 20
Trainable params: 20
Non-trainable params: 0
_________________________________________________________________


Summary of model2:

_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
conv2d_4 (Conv2D)            (None, 6, 6, 3)           57        
_________________________________________________________________
activation_4 (Activation)    (None, 6, 6, 3)           0         
=================================================================
Total params: 57
Trainable params: 57
Non-trainable params: 0
_________________________________________________________________


And the summary of model_merge:

_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
conv2d_input (InputLayer)    (None, 6, 6, 1)           0         
_________________________________________________________________
conv2d (Conv2D)              (None, 6, 6, 2)           20        
_________________________________________________________________
sequential_2 (Sequential)    (None, 6, 6, 3)           57        
_________________________________________________________________
activation_4 (Activation)    (None, 6, 6, 3)           0         
=================================================================
Total params: 77
Trainable params: 77
Non-trainable params: 0
_________________________________________________________________


Let's show that this merged model is not disconnected:

layers = [layer.output for layer in model_merge.layers]
test1 = Model(inputs=model_merge.input, outputs=layers[-1])


Everything works fine.
Summary of test1:

_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
conv2d_input (InputLayer)    (None, 6, 6, 1)           0         
_________________________________________________________________
conv2d (Conv2D)              (None, 6, 6, 2)           20        
_________________________________________________________________
sequential_2 (Sequential)    (None, 6, 6, 3)           57        
_________________________________________________________________
activation_4 (Activation)    (None, 6, 6, 3)           0         
=================================================================
Total params: 77
Trainable params: 77
Non-trainable params: 0
_________________________________________________________________


And here is the tragedy:

test2 = Model(inputs=model_merge.input, outputs=layers[-2])


The most important part of the traceback:

ValueError: Graph disconnected: cannot obtain value for tensor Tensor("conv2d_2_input:0", shape=(?, 6, 6, 2), dtype=float32) at layer "conv2d_2_input". The following previous layers were accessed without issue: []


Full traceback:

ValueErrorTraceback (most recent call last)
<ipython-input-18-946b325081c1> in <module>
----> 1 test = Model(inputs=model_merge.input, outputs=layers[-2])

/usr/local/lib/python3.5/dist-packages/tensorflow/python/keras/engine/training.py in __init__(self, *args, **kwargs)
    119 
    120   def __init__(self, *args, **kwargs):
--> 121     super(Model, self).__init__(*args, **kwargs)
    122     # Create a cache for iterator get_next op.
    123     self._iterator_get_next = weakref.WeakKeyDictionary()

/usr/local/lib/python3.5/dist-packages/tensorflow/python/keras/engine/network.py in __init__(self, *args, **kwargs)
     79         'inputs' in kwargs and 'outputs' in kwargs):
     80       # Graph network
---> 81       self._init_graph_network(*args, **kwargs)
     82     else:
     83       # Subclassed network

/usr/local/lib/python3.5/dist-packages/tensorflow/python/training/checkpointable/base.py in _method_wrapper(self, *args, **kwargs)
    440     self._setattr_tracking = False  # pylint: disable=protected-access
    441     try:
--> 442       method(self, *args, **kwargs)
    443     finally:
    444       self._setattr_tracking = previous_value  # pylint: disable=protected-access

/usr/local/lib/python3.5/dist-packages/tensorflow/python/keras/engine/network.py in _init_graph_network(self, inputs, outputs, name)
    219     # Keep track of the network's nodes and layers.
    220     nodes, nodes_by_depth, layers, layers_by_depth = _map_graph_network(
--> 221         self.inputs, self.outputs)
    222     self._network_nodes = nodes
    223     self._nodes_by_depth = nodes_by_depth

/usr/local/lib/python3.5/dist-packages/tensorflow/python/keras/engine/network.py in _map_graph_network(inputs, outputs)
   1850                              'The following previous layers '
   1851                              'were accessed without issue: ' +
-> 1852                              str(layers_with_complete_input))
   1853         for x in node.output_tensors:
   1854           computable_tensors.append(x)

ValueError: Graph disconnected: cannot obtain value for tensor Tensor("conv2d_2_input:0", shape=(?, 6, 6, 2), dtype=float32) at layer "conv2d_2_input". The following previous layers were accessed without issue: []


This is driving me crazy. Any ideas?

ygya80vv answered:

The layer you are trying to use as the output has two output nodes. The first one connects model2's own input to model2's output. The second output node connects model1's output to model2's first layer. By default, layer.output returns only the first output node, so what happens is that you connect model_merge's input (model1's input) to the first output node, which cannot be reached from it.
The code below shows this. The individual output nodes of a layer can be accessed with the layer's get_output_at() method.

layer_output = model_merge.layers[-2].output # The first output node
layer_output_1 = model_merge.layers[-2].get_output_at(0) # The first output node
layer_output_2 = model_merge.layers[-2].get_output_at(1) # The second output node
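
For illustration (a small sketch, not part of the original answer), printing the two nodes shows that they are distinct tensors: the first traces back to model2's own standalone input, the second to model1's graph inside model_merge.

# The two output nodes are different tensors; only the second is reachable
# from model_merge.input (i.e. from model1's input).
print(layer_output_1)
print(layer_output_2)
print(layer_output_1 is layer_output_2)  # False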

Now, both of the following lines throw the error, because the graph is disconnected:

test2 = Model(inputs=model_merge.input, outputs=layer_output)


test2 = Model(inputs=model_merge.input, outputs=layer_output_1)


But the following does not throw an error, because the graph is connected:

test2 = Model(inputs=model_merge.input, outputs=layer_output_2)
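
A related way to avoid the ambiguity altogether (a sketch under the same setup, not part of the original answer) is to keep your own handle on the tensor created when model2 is called on model1's features, and build sub-models from that handle instead of going through layer.output or get_output_at():

# Capture the intermediate tensors explicitly while building the merged model.
features = model1.get_layer('conv2d').output        # (None, 6, 6, 2)
merged_out = model2(features)                        # the node connected to model1's graph
final_out = Activation('softmax')(merged_out)

model_merge = Model(inputs=model1.input, outputs=final_out)
test2 = Model(inputs=model_merge.input, outputs=merged_out)  # connected: uses the captured tensor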
