keras_tuner does not work for time series forecasting

biswetbf  posted on 2023-10-19 in Other

I am trying to use keras_tuner to tune the hyperparameters of a GRU model, but without success. The code works when I run it without validation data, but it fails on tuner.search(x_train, epochs=5, validation_data=x_val). On top of that, the "best" hyperparameters I get back produce a ridiculously underfitted model, even though the loss values reported while keras_tuner was running looked very good. How do I tune the parameters correctly?
'''

import tensorflow as tf
import keras_tuner
from keras_tuner import HyperModel, Objective

def windowed_dataset_test(series, window_size, batch_size):
    dataset = tf.data.Dataset.from_tensor_slices(series)
    dataset = dataset.window(window_size + 1, shift=1, drop_remainder=True)
    dataset = dataset.flat_map(lambda window: window.batch(window_size + 1))
    dataset = dataset.map(lambda window: (window[:-1], window[-1]))
    dataset = dataset.batch(batch_size).prefetch(1)
    return dataset
class MyHyperModel(HyperModel):
    def build(self, hp):
        strategy = tf.distribute.MirroredStrategy()
        window_size = hp.Int('window_size', min_value=5, max_value=60, step=3)
        with strategy.scope():
            model = tf.keras.models.Sequential()
            for i in range(hp.Int("num_layers", min_value=1, max_value=5, step=1)):  # tuning the num of layers
                model.add(tf.keras.layers.GRU(
                    units=hp.Int("units", min_value=10, max_value=500, step=10),  # tuning the num of units
                    input_shape=[window_size, 1],  # use self.window_size here
                    return_sequences=True,
                ))
            model.add(tf.keras.layers.Dense(1))
            learning_rate = hp.Float("lr", min_value=1e-8, max_value=1e-2, sampling="log")  # tuning the learning rate
            model.compile(
                optimizer=tf.optimizers.SGD(learning_rate=learning_rate, momentum=hp.Float('momentum', min_value=0, max_value=0.9, step=0.1)),
                # tuning the momentum
                loss="mse",
                metrics="mse")
        return model

    def fit(self, hp: object, model: object, x: object, **kwargs: object) -> object:
        batch_size = hp.Choice('batch_size', [8, 16, 32, 64, 128])
        x = windowed_dataset_test(x_train, hp.get('window_size'), batch_size)

        return model.fit(x, **kwargs)

tuner = keras_tuner.BayesianOptimization(
    MyHyperModel(),
    objective=Objective('val_loss', direction='min'),
    max_trials=50,
    overwrite=True,
    directory="hyperprameter_tuning",
    project_name="GRU_only_price_baysian",
)

series_price = df['nationwide_price']
x_train = series_price['2010-03-13':'2021-12-31'].to_numpy()
x_test = series_price['2022-01-01':].to_numpy()
val_split = int(len(x_train) * 0.9)  # or any other split ratio
x_val = x_train[val_split:]
x_train = x_train[:val_split]

tuner.search(x_train, epochs=5, validation_data=x_val)
with tf.device('/device:GPU:0'):
    tuner.search(x_train, epochs=3)

hypermodel = MyHyperModel()
best_hp = tuner.get_best_hyperparameters()[0:10]
print(best_hp)

ValueError: The truth value of an array with more than one element is ambiguous. Use a.any() or a.all()

'''


ujv3wf0j1#

This usually happens when an array is used where a single boolean value is expected.
In your code, the error most likely comes from this line:

tuner.search(x_train, epochs=5, validation_data=x_val)

To fix this error, modify the code as follows:

  • Check the data types and structure of x_train and x_val to make sure they are NumPy arrays.
  • Make sure the validation_data argument of tuner.search() is properly formatted. It should be a tuple containing the validation data and labels (if applicable).
  • Make sure the shapes of x_train and x_val match the model's input requirements. They should have the same number of features and compatible dimensions.

Assuming x_train and x_val are NumPy arrays, make sure they have compatible shapes.

tuner.search(x=x_train, epochs=5, validation_data=(x_val, y_val))

In the code above, replace y_val with your validation labels if you have them. By making sure your data is formatted correctly and supplying the validation data as a tuple, you should be able to get rid of the "truth value of an array" error.
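
Since this hypermodel already overrides fit() and windows x with windowed_dataset_test(), another option is to window the validation series the same way inside fit(), so that val_loss is computed on data shaped exactly like the training windows. The snippet below is only a minimal sketch of that idea, assuming tuner.search() forwards validation_data through to HyperModel.fit(); it is not part of the original code.

'''
class MyHyperModel(HyperModel):
    # build() stays the same as in the question

    def fit(self, hp, model, x, validation_data=None, **kwargs):
        # Sketch only: window both the training and the validation series
        # with the same helper, window size, and batch size.
        batch_size = hp.Choice('batch_size', [8, 16, 32, 64, 128])
        window_size = hp.get('window_size')
        train_ds = windowed_dataset_test(x, window_size, batch_size)
        if validation_data is not None:
            # validation_data arrives here as the raw x_val array passed to
            # tuner.search(); turn it into (window, target) pairs as well.
            validation_data = windowed_dataset_test(validation_data, window_size, batch_size)
        return model.fit(train_ds, validation_data=validation_data, **kwargs)

tuner.search(x_train, epochs=5, validation_data=x_val)
'''

With this, tuner.search() can keep receiving the raw arrays, and val_loss becomes available as the tuning objective.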
