pandas - Error when using sigmoid as the activation function of the last Dense layer of an LSTM

Asked by k7fdbhmy on 2022-11-20

When I try to use sigmoid as the activation function of the last Dense layer of an LSTM, I get the following error:

ValueError: `logits` and `labels` must have the same shape, received ((None, 60, 1) vs (None,)).

The code looks like this:

from sklearn.preprocessing import StandardScaler
from tensorflow.keras.preprocessing.sequence import TimeseriesGenerator
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense

scaler = StandardScaler()
X_train_s = scaler.fit_transform(X_train)  # scaled training set
X_test_s = scaler.transform(X_test)        # scaled test set

length = 60
n_features=89

generator = TimeseriesGenerator(X_train_s, Y_train['TARGET_ENTRY_LONG'], length=length, batch_size=1)
validation_generator = TimeseriesGenerator(X_test_s, Y_test['TARGET_ENTRY_LONG'], length=length, batch_size=1)

# define model
model = Sequential()
model.add(LSTM(90, activation='relu', input_shape=(length, n_features), return_sequences=True, dropout = 0.3))
model.add(LSTM(30,activation='relu',return_sequences=True, dropout = 0.3))
model.add(Dense(1, activation = 'sigmoid'))
model.compile(optimizer='adam', loss='binary_crossentropy')

model.summary()

# fit model

model.fit(generator, epochs=3,
          validation_data=validation_generator)
          # callbacks=[early_stop])

If I replace the last layer declaration with the one below

model.add(Dense(1))

I no longer get the error, but that is probably not the intended result either. Any ideas?
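For reference, the shape mismatch in the error message can be seen directly on the model defined above (a minimal sketch that just reuses the model object from the code block; nothing is assumed beyond Keras being installed):

# With return_sequences=True on the second LSTM, the Dense layer is applied per
# timestep, so the model emits one prediction per timestep instead of one per sample.
print(model.output_shape)   # (None, 60, 1) -- the labels are (None,), hence the ValueError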


nwnhqdif #1

After a few more attempts I found the cause. As Dr. Snoopy pointed out in an earlier comment, the problem sits just before the last layer: when the final layer is a Dense layer with sigmoid activation for binary classification, the LSTM layer that feeds it must not be created with return_sequences=True (only LSTM layers that feed another LSTM should keep it). With return_sequences=True the second LSTM outputs one vector per timestep, shape (None, 60, 30), so the Dense layer produces (None, 60, 1) instead of (None, 1), which cannot be matched against labels of shape (None,). So this layer

model.add(LSTM(30,activation='relu',return_sequences=True, dropout = 0.3))

should instead be written like this:

model.add(LSTM(30,activation='relu', dropout = 0.3))
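Putting the fix in context, here is a minimal sketch of the corrected stack (same layer sizes and the length/n_features values from the question, with the standard tensorflow.keras imports):

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense

length, n_features = 60, 89

model = Sequential()
# The first LSTM feeds another LSTM, so it keeps return_sequences=True.
model.add(LSTM(90, activation='relu', input_shape=(length, n_features),
               return_sequences=True, dropout=0.3))
# The last LSTM returns only its final state: output shape (None, 30).
model.add(LSTM(30, activation='relu', dropout=0.3))
# Dense + sigmoid now produces (None, 1), which matches the binary labels.
model.add(Dense(1, activation='sigmoid'))
model.compile(optimizer='adam', loss='binary_crossentropy')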
