scipy: how to display the progress of differential evolution without recomputing the objective function

Asked by k4aesqcs on 2023-06-23

I want to display the progress of differential_evolution and store the objective-function values as it runs. My MWE is:

import numpy as np
from scipy.optimize import differential_evolution, LinearConstraint

def de_optimise():
    def build_show_de(MIN=None):
        if MIN is None:
            MIN = [0]
        def fn(xk, convergence):
            # opt is the objective function (defined elsewhere)
            obj_val = opt(xk)
            if obj_val < MIN[-1]:
                print("DE", [round(x, 2) for x in xk], obj_val)
                MIN.append(obj_val)  # reuse the value rather than calling opt(xk) a second time
        return fn

    bounds = [(0,1)]*3
    # Define the linear constraints
    A = [[-1, 1, 0], [0, -1, 1]]
    lb = [0.3, 0.4]
    ub = [np.inf, np.inf]
    constraints = LinearConstraint(A, lb, ub)
    progress_f = [0]
    c = build_show_de(progress_f)
    print("Optimizing using differential evolution")

    res = differential_evolution(
        opt, 
        bounds=bounds,
        constraints=constraints,
        callback=c, 
        disp=True
    )
    print(f"external way of keeping track of MINF: {progress_f}")

de_optimise()

It works, but inside fn I have to recompute opt(xk), which has necessarily already been evaluated. I have to do it this way because the callback for differential_evolution is documented as follows:

callback : callable, callback(xk, convergence=val), optional. A function to follow the progress of the minimization. xk is the best solution found so far. val represents the fractional value of the population convergence. When val is greater than one the function halts. If callback returns True, then the minimization is halted (any polishing is still carried out).

Since the objective is slow, this considerably slows down the optimization. How can I avoid having to do this?

Answer by e4yzc0pl:

If I understand correctly, you want something like this:

from scipy.optimize import differential_evolution, LinearConstraint, rosen
import numpy as np

class fn:
    """Objective wrapper that records the best solution seen so far."""
    def __init__(self):
        self.best_x = None
        self.minf = np.inf

    def __call__(self, x):
        f = rosen(x)

        if f < self.minf:
            self.minf = f
            self.best_x = np.copy(x)  # copy: the solver may reuse its arrays in place

        return f

    
class callback:
    """Progress callback that reads the value already recorded by fn."""
    def __init__(self, FN):
        self.FN = FN

    def __call__(self, xk, convergence):
        # xk is the best solution so far; fn has already stored it and its value,
        # so nothing needs to be re-evaluated here
        np.testing.assert_equal(xk, self.FN.best_x)
        print(self.FN.best_x, self.FN.minf)

FN = fn()
C = callback(FN)

res = differential_evolution(FN, [(0, 10)] * 5, callback=C)
print(res)

Output:
[1.73780654 1.5180404  2.07430624 1.72280576 5.55451018] 1567.9474614862615
[1.20847265 0.90208029 0.27028852 2.46391859 6.70219186] 674.9123192084
...
[1. 1. 1. 1. 1.] 0.0
 message: Optimization terminated successfully.
 success: True
     fun: 0.0
       x: [ 1.000e+00  1.000e+00  1.000e+00  1.000e+00  1.000e+00]
     nit: 591
    nfev: 44406
