I'm having trouble applying any callbacks to a Keras Tuner hyperparameter optimizer object. Below is the code I'm running:
import time
from keras.callbacks import TensorBoard, EarlyStopping, ModelCheckpoint
from kerastuner.tuners import BayesianOptimization
%load_ext tensorboard
BATCH_SIZE = 32
time_stamp = time.time()
tensorboard = TensorBoard(log_dir="graphs/{}".format(time_stamp))
checkpoint = ModelCheckpoint(filepath=r"D:\Uni work\...\CNN.hdf5", monitor='val_accuracy', verbose=1, save_best_only=True)
early_stopping = EarlyStopping(monitor="val_loss", patience=3, verbose=2)
tuner = BayesianOptimization(build_model, objective="val_accuracy", max_trials=30, num_initial_points=2, project_name="audio_classifier")
tuner.search(x=train_X, y=y_cat_encoded, epochs=35, callbacks=early_stopping, batch_size=BATCH_SIZE, validation_data=(validation_X, y_validation_cat_encoded))
Although I eventually want to apply the tensorboard and checkpoint callbacks as well, it fails even when I pass only the early-stopping callback. I get the following error:
C:\Anaconda\envs\test\lib\site-packages\kerastuner\engine\tuner.py in _deepcopy_callbacks(self, callbacks)
277 callbacks = copy.deepcopy(callbacks)
278 except:
--> 279 raise ValueError(
280 'All callbacks used during a search '
281 'should be deep-copyable (since they are '
ValueError: All callbacks used during a search should be deep-copyable (since they are reused across trials). It is not possible to do `copy.deepcopy(<tensorflow.python.keras.callbacks.EarlyStopping object at 0x000001802D138100>)`
I'm not really familiar with the term "deep-copyable", or what it means in the context of this error. Does anyone know how to fix this?
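For background on the term: an object is deep-copyable when `copy.deepcopy` can recursively clone it into a fully independent copy. Objects holding OS-level resources such as thread locks cannot be cloned this way, which is the kind of failure the tuner reports. A minimal, Keras-free sketch (the classes are hypothetical stand-ins, not Keras callbacks):

```python
import copy
import threading

class PlainCallback:
    """Only ordinary Python attributes -- this is deep-copyable."""
    def __init__(self):
        self.best_score = None

clone = copy.deepcopy(PlainCallback())   # succeeds: an independent copy

class LockingCallback:
    """Holds a thread lock -- locks cannot be deep-copied."""
    def __init__(self):
        self.lock = threading.Lock()

try:
    copy.deepcopy(LockingCallback())
except TypeError as err:
    print("not deep-copyable:", err)
```

The tuner deep-copies the callbacks before each trial so that state accumulated in one trial (e.g. an early-stopping counter) does not leak into the next.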
1 Answer
I'm late to the party, but maybe someone will still need this answer:
In my case, the error meant that the callbacks' variables should be defined outside the model-building function, so that `search` can access them. In your particular case, I think there are two possible causes:
1. The callbacks should be passed as a list, even when there is only one:
callbacks = [early_stopping]
2. The code formatting does not follow PEP 8: https://peps.python.org/pep-0008/
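Applying fix 1 to the `tuner.search` call from the question would look roughly like this (a sketch reusing the question's variable names, so it is not runnable on its own):

```python
# All callbacks go into one list -- tuner.search deep-copies the
# callbacks once per trial, so each trial starts with fresh state.
callbacks = [tensorboard, checkpoint, early_stopping]

tuner.search(
    x=train_X,
    y=y_cat_encoded,
    epochs=35,
    callbacks=callbacks,   # a list, even for a single callback
    batch_size=BATCH_SIZE,
    validation_data=(validation_X, y_validation_cat_encoded),
)
```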