Incompatible shapes with mean squared error in Keras

7gyucuyw · asked on 2022-12-04

I want to train an RNN with Keras. X has shape (4413, 71, 19) and y has shape (4413, 2).
Code:

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dropout, BatchNormalization, Dense

model = Sequential()
model.add(LSTM(128, return_sequences=True, input_shape=(None,19)))
model.add(Dropout(.2))
model.add(BatchNormalization())

model.add(LSTM(128, return_sequences=True, input_shape=(None,19)))
model.add(Dropout(.2))
model.add(BatchNormalization())

model.add(LSTM(128, return_sequences=True, input_shape=(None,19)))
model.add(Dropout(.2))
model.add(BatchNormalization())

model.add(Dense(32, activation='relu'))
model.add(Dropout(.2))

model.add(Dense(2, activation='softmax'))

model.compile(optimizer='adam', loss='mean_squared_error', metrics=['mean_squared_error'])

When I fit the model, I get the error below; it looks as if the loss function cannot handle data of this shape:

Incompatible shapes: [64,2] vs. [64,71,2]
     [[{{node mean_squared_error/SquaredDifference}}]] [Op:__inference_train_function_157671]
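One way to see where the extra dimension comes from is to inspect the model's output shape right after building the model above (a minimal check, not part of the original code):

# With return_sequences=True on the last LSTM, the model keeps the time
# dimension and emits one 2-vector per timestep for every sample.
print(model.output_shape)   # (None, None, 2) -> (batch, timesteps, 2)
# The targets y have shape (4413, 2), one 2-vector per sample, hence the
# mismatch [64, 2] vs. [64, 71, 2] at batch size 64.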

szqfcxe2 1#

Try setting return_sequences=False on the last LSTM layer. With return_sequences=True, that layer returns the whole sequence, so the final Dense layer produces an output of shape (batch, 71, 2), while your targets have shape (batch, 2), which is exactly the mismatch in the error message:

model = Sequential()
model.add(LSTM(128, return_sequences=True, input_shape=(None,19)))
model.add(Dropout(.2))
model.add(BatchNormalization())

model.add(LSTM(128, return_sequences=True))
model.add(Dropout(.2))
model.add(BatchNormalization())

model.add(LSTM(128, return_sequences=False))
model.add(Dropout(.2))
model.add(BatchNormalization())

model.add(Dense(32, activation='relu'))
model.add(Dropout(.2))

model.add(Dense(2, activation='linear'))

model.compile(optimizer='adam', loss='mean_squared_error', metrics=['mean_squared_error'])

I also changed the activation of the output layer to linear, because a softmax layer makes little sense in your case: with a mean squared error loss you are regressing two values, not predicting a probability distribution over two classes.
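As a quick sanity check, the corrected model can be fitted on random arrays with the shapes from the question (placeholder data, not the asker's real dataset):

import numpy as np

X = np.random.random((4413, 71, 19)).astype("float32")   # placeholder input
y = np.random.random((4413, 2)).astype("float32")         # placeholder targets

print(model.output_shape)          # (None, 2): one prediction per sample
model.fit(X, y, batch_size=64, epochs=1)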
