Decayed learning rate in a Keras CNN

klsxnrf1 · posted 2023-10-19

I want to implement a CNN with the Adam optimizer and learning-rate decay, using a callback to print the current decayed learning rate (Keras/TensorFlow):

class CustomCallback(tf.keras.callbacks.Callback):
    def on_epoch_begin(self, epoch, logs=None):
        current_decayed_lr = self.model.optimizer._decayed_lr(tf.float32).numpy()
        print("current decayed lr: {:0.7f}".format(current_decayed_lr))

But I get the following error message:

current_decayed_lr = self.model.optimizer._decayed_lr(tf.float32).numpy()

AttributeError: 'Adam' object has no attribute '_decayed_lr'

How can I fix this?

nc1teljy


In the revised Keras optimizer API, the `decay` argument has been deprecated, and the new optimizers no longer expose `_decayed_lr`. The previous optimizers are still available under the legacy namespace, so you can import from there. Note, however, that it would be better to adopt the new API (e.g. a `LearningRateSchedule`) going forward.

import numpy as np
import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers
from tensorflow.keras.optimizers.legacy import Adam

inp = keras.Input((2,))
out = layers.Dense(2)(inp)
model = keras.Model(inp, out)
model.compile(Adam(0.01, decay=0.001), loss='mse')

class CustomCallback(keras.callbacks.Callback):
    def on_epoch_begin(self, epoch, logs=None):
        # _decayed_lr is a private method of the legacy optimizers: it returns
        # the learning rate after the `decay` schedule has been applied.
        current_decayed_lr = self.model.optimizer._decayed_lr(tf.float32).numpy()
        print("current decayed lr: {:0.7f}".format(current_decayed_lr))

x = y = np.random.randn(32, 2) 
model.fit(x, y, callbacks=[CustomCallback()], epochs=10)

Output:
current decayed lr: 0.0100000
Epoch 1/10
1/1 [==============================] - 1s 661ms/step - loss: 2.6350
current decayed lr: 0.0099900
Epoch 2/10
1/1 [==============================] - 0s 5ms/step - loss: 2.5868
current decayed lr: 0.0099800
Epoch 3/10
1/1 [==============================] - 0s 8ms/step - loss: 2.5394
current decayed lr: 0.0099701
Epoch 4/10
1/1 [==============================] - 0s 8ms/step - loss: 2.4927
current decayed lr: 0.0099602
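For reference, the legacy `decay` argument applies inverse time decay: lr(step) = initial_lr / (1 + decay * step). Since the example has 32 samples and the default batch size is 32, each epoch is one optimizer step, and the values printed above can be reproduced in plain Python (a sketch of the formula; in the new API the equivalent would be a `tf.keras.optimizers.schedules.InverseTimeDecay` schedule passed as `learning_rate`):

```python
def decayed_lr(initial_lr, decay, step):
    # Inverse time decay, as applied by the legacy `decay` argument
    return initial_lr / (1 + decay * step)

# One step per epoch here, so the printed values correspond to steps 0..3
for step in range(4):
    print("current decayed lr: {:0.7f}".format(decayed_lr(0.01, 0.001, step)))
# → 0.0100000, 0.0099900, 0.0099800, 0.0099701 — matching the output above
```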
