I want to add a skip connection between residual blocks in Keras. This is my current implementation, which does not work because the tensors have different shapes.
The function looks like this:
def build_res_blocks(net, x_in, num_res_blocks, res_block, num_filters, res_block_expansion, kernel_size, scaling):
    net_next_in = net
    for i in range(num_res_blocks):
        net = res_block(net_next_in, num_filters, res_block_expansion, kernel_size, scaling)
        # net tensor shape: (None, None, 32)
        # x_in tensor shape: (None, None, 3)
        # Error here: net_next_in should have shape (None, None, 32) to be fed into the next layer
        net_next_in = Add()([net, x_in])
    return net
But I get:
ValueError: Operands could not be broadcast together with shapes (None, None, 32) (None, None, 3)
How can I add or merge these tensors into the correct shape (None, None, 32)? If this is not the right approach, how can you achieve the intended result? The res_block looks like this:
def res_block(x_in, num_filters, expansion, kernel_size, scaling):
    x = Conv2D(num_filters * expansion, kernel_size, padding='same')(x_in)
    x = Activation('relu')(x)
    x = Conv2D(num_filters, kernel_size, padding='same')(x)
    x = Add()([x_in, x])
    return x
1 Answer
You cannot add tensors with different shapes. You could concatenate them with keras.layers.Concatenate, but that would leave you with a tensor of shape
[None, None, 35].
Alternatively, have a look at the ResNet50 implementation in Keras. Their residual block features a 1x1xC convolution in the shortcut for exactly those cases where the dimensions to be added differ.
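A minimal sketch of that projection-shortcut idea applied to the question's code, assuming TF/Keras 2.x: a 1x1 Conv2D maps x_in from 3 to num_filters channels once, so the skip tensor matches the shape of every block's output. The unused scaling argument from the original is omitted here for brevity.

```python
# Assumption: TensorFlow 2.x with the bundled Keras API.
from tensorflow.keras.layers import Input, Conv2D, Activation, Add
from tensorflow.keras.models import Model

def res_block(x_in, num_filters, expansion, kernel_size):
    # Same structure as the question's res_block (scaling omitted).
    x = Conv2D(num_filters * expansion, kernel_size, padding='same')(x_in)
    x = Activation('relu')(x)
    x = Conv2D(num_filters, kernel_size, padding='same')(x)
    return Add()([x_in, x])

def build_res_blocks(x_in, num_res_blocks, num_filters, expansion, kernel_size):
    # 1x1 projection shortcut: (None, None, 3) -> (None, None, num_filters),
    # mirroring the ResNet-style shortcut mentioned in the answer.
    shortcut = Conv2D(num_filters, 1, padding='same')(x_in)
    net = shortcut
    for _ in range(num_res_blocks):
        net = res_block(net, num_filters, expansion, kernel_size)
        # Both tensors now have num_filters channels, so Add works.
        net = Add()([net, shortcut])
    return net

inp = Input(shape=(None, None, 3))
out = build_res_blocks(inp, num_res_blocks=2, num_filters=32, expansion=4, kernel_size=3)
model = Model(inp, out)
print(model.output_shape)  # (None, None, None, 32)
```

The projection is applied once, outside the loop, so every iteration adds the same projected skip tensor rather than re-projecting.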