Implementing an inexact (approximate) Newton algorithm in Python using conjugate gradient

xxe27gdn · asked 2023-02-21 in Python

I want to implement in Python the inexact (approximate) Newton algorithm given by the following pseudocode:
Data: a starting point x_0 ∈ R^n, a function f: R^n → R with positive definite Hessian, and constants 0 < c_1 < 1 and 0 < ρ < 1.
Result: an estimate x_{k+1} of the global minimizer of f.
Steps: for k = 0, 1, ... until a stopping condition is met, do:

η_k = min{1/2, √‖∇f(x_k)‖}

ε_k = η_k ‖∇f(x_k)‖

p_k = conjugate_gradient(∇²f(x_k), -∇f(x_k), ε_k)

α_k = backtrack(x_k, p_k, 1.0, c_1, ρ)

x_{k+1} = x_k + α_k p_k
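In Python, the first two steps (the forcing sequence) translate directly into NumPy. Here is a minimal sketch, assuming grad_f and the current iterate x_k are already defined (both names are placeholders):

import numpy as np

g_norm = np.linalg.norm(grad_f(x_k))   # ||∇f(x_k)|| (grad_f, x_k assumed defined)
eta_k = min(0.5, np.sqrt(g_norm))      # forcing term eta_k
eps_k = eta_k * g_norm                 # CG tolerance eps_k for this outer step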

In case the algorithm is hard to read here, I have also posted it in my earlier question: Implementing an inexact Newton algorithm in Python.
I have already implemented the conjugate gradient algorithm as follows:

import numpy as np

def conjugate_grad(A, b, tol=1e-10, maxiter=100000):
    """Solve Ax = b for symmetric positive definite A by conjugate gradient.

    tol plays the role of epsilon_k in the pseudocode: iteration stops once
    the residual norm drops below it.
    """
    n = A.shape[0]
    x = np.zeros(n)
    r = b - A @ x               # initial residual
    p = r.copy()                # initial search direction
    r_old = np.inner(r, r)
    for it in range(maxiter):
        Ap = A @ p                        # reuse the matrix-vector product
        alpha = r_old / np.inner(p, Ap)   # step length along p
        x += alpha * p
        r -= alpha * Ap
        r_new = np.inner(r, r)
        if np.sqrt(r_new) < tol:          # residual small enough: converged
            break
        beta = r_new / r_old
        p = r + beta * p                  # next conjugate direction
        r_old = r_new
    return x
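For example, on a small symmetric positive definite system the result can be checked against np.linalg.solve:

A = np.array([[4.0, 1.0],
              [1.0, 3.0]])
b = np.array([1.0, 2.0])
print(conjugate_grad(A, b))      # [0.09090909 0.63636364]
print(np.linalg.solve(A, b))     # same solution, as a cross-check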

But how do I implement the inexact (approximate) Newton algorithm itself in Python, how do I handle the min in the forcing term, and how do I wire it up correctly with backtracking? Can anyone help me?


tez616oj · answer 1

Here is an implementation of the inexact (approximate) Newton algorithm in Python, using conjugate gradient and a backtracking line search:

import numpy as np
from scipy.sparse.linalg import cg

def backtracking_line_search(f, grad_f, x, d, alpha=1.0, rho=0.5, c=1e-4):
    """
    Backtracking line search to find a step size that satisfies the Armijo condition.
    f: the function to be minimized
    grad_f: the gradient of f
    x: the current iterate
    d: the search direction
    alpha: the initial step size
    rho: the backtracking parameter (0 < rho < 1)
    c: the Armijo condition parameter (0 < c < 1)
    """
    while f(x + alpha*d) > f(x) + c*alpha*np.dot(grad_f(x), d):
        alpha *= rho
    return alpha
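A quick sanity check (a hypothetical one-off test, not part of the answer's code): on the quadratic f(x) = Σ x², starting from x = [1.0] with descent direction d = [-1.0], the full step already satisfies the Armijo condition, so alpha = 1.0 is returned:

def quad(x):
    return np.sum(x**2)      # hypothetical test objective

def grad_quad(x):
    return 2*x

step = backtracking_line_search(quad, grad_quad, np.array([1.0]), np.array([-1.0]))
print(step)  # 1.0: f(0) = 0 < f(1) + c*1*(-2) ≈ 0.9998, so no backtracking needed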

def newton_cg(f, x0, grad_f, hess_f, tol=1e-6, max_iter=100):
    """
    Inexact Newton method with conjugate gradient for minimizing f.
    f: the function to be solved
    x0: the initial guess
    grad_f: the gradient of f
    hess_f: the Hessian matrix of f
    tol: the tolerance for convergence
    max_iter: the maximum number of iterations
    """
    x = x0
    for i in range(max_iter):
        g = grad_f(x)
        if np.linalg.norm(g) < tol:
            print(f"Converged in {i} iterations.")
            return x
        H = hess_f(x)
        # Forcing term from the pseudocode: eta_k = min(1/2, sqrt(||g||))
        eta = min(0.5, np.sqrt(np.linalg.norm(g)))
        # Inexact Newton direction via conjugate gradient; rtol is relative
        # to ||g||, so this enforces ||residual|| <= eta*||g|| = eps_k
        # (scipy >= 1.12 names the argument rtol; older releases call it tol)
        d, info = cg(H, -g, rtol=eta)
        # Perform a line search using backtracking
        alpha = backtracking_line_search(f, grad_f, x, d)
        # Update x
        x = x + alpha*d
    print("Failed to converge.")
    return x

The backtracking_line_search function takes the objective f, its gradient grad_f, the current iterate x, the search direction d, and optional arguments alpha, rho, and c for the initial step size, the backtracking factor, and the Armijo parameter; it backtracks until the Armijo condition holds and returns the accepted step size.
The newton_cg function is similar to the earlier implementation, but performs the line search with backtracking_line_search instead of a fixed step size, which should improve convergence.
Here is an example of using these functions to find a local minimum of f(x) = Σ_i (x_i³ − x_i² − 1):

def f(x):
    return np.sum(x**3 - x**2 - 1)   # scalar objective, summed over components

def grad_f(x):
    return 3*x**2 - 2*x              # elementwise gradient of the sum

def hess_f(x):
    return np.diag(6*x - 2)          # diagonal Hessian of the separable objective

x0 = np.array([1.0, 1.0])
x = newton_cg(f, x0, grad_f, hess_f)
print(x)
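As a cross-check (an addition, not part of the original answer), SciPy's built-in Newton-CG solver implements the same idea and should land on the same local minimizer near [2/3, 2/3]:

from scipy.optimize import minimize

res = minimize(f, x0, jac=grad_f, hess=hess_f, method='Newton-CG')
print(res.x)   # expect approximately [0.6667, 0.6667]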
