The Douglas–Rachford method: Maximally monotone + strongly monotone/cocoercive
Problem setup
Consider the monotone inclusion
\[
\text{find } x \in \calH \text{ such that } 0 \in G_1(x) + G_2(x),
\]
where:
- \(G_1 : \calH \rightrightarrows \calH\) is maximally monotone,
- \(G_2 : \calH \to \calH\) is both \(\mu\)-strongly monotone and \(\beta\)-cocoercive, with \(\mu>0\) and \(\beta>0\).
For an initial point \(x^0 \in \calH\), step size \(\gamma \in \reals_{++}\), and relaxation \(\lambda \in \reals\), the Douglas–Rachford update is
\[
\begin{aligned}
y^k &= J_{\gamma G_2}(x^k), \\
w^k &= J_{\gamma G_1}(2 y^k - x^k), \\
x^{k+1} &= x^k + \lambda\,(w^k - y^k).
\end{aligned}
\]
In this example, we search for the smallest contraction factor \(\rho\in[0,1)\) provable using AutoLyap such that
\[
\|x^{k+1} - x^\star\|^2 \le \rho\,\|x^k - x^\star\|^2 \quad \text{for all } k \ge 0,
\]
where \(x^\star\) is a fixed point of the Douglas–Rachford operator.
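Before turning to AutoLyap, the iteration itself can be sketched in plain NumPy on a toy instance. The operators and parameters below are illustrative assumptions, not part of the library: \(G_1\) is the normal cone of the nonnegative orthant (whose resolvent is the projection onto it) and \(G_2 = \mu\,\mathrm{Id}\), which is \(\mu\)-strongly monotone and \((1/\mu)\)-cocoercive.

```python
import numpy as np

# Toy instance (illustrative only): G1 = normal cone of {x : x >= 0},
# G2 = mu*Id. The unique zero of G1 + G2 is x = 0.
mu, gamma, lam = 1.0, 0.5, 1.0  # toy parameters, chosen for illustration

def resolvent_G2(z):
    # J_{gamma G2}(z) = (I + gamma*mu*I)^{-1} z
    return z / (1.0 + gamma * mu)

def resolvent_G1(z):
    # J_{gamma G1}(z) = projection onto the nonnegative orthant
    return np.maximum(z, 0.0)

z = np.array([3.0, -2.0, 1.5])
for _ in range(50):
    x = resolvent_G2(z)            # resolvent step on G2
    y = resolvent_G1(2.0 * x - z)  # reflected resolvent step on G1
    z = z + lam * (y - x)          # relaxed Douglas-Rachford update

print(np.linalg.norm(z))  # shrinks geometrically toward 0 on this instance
```

The iterate \(z\) contracts toward the fixed point \(0\), consistent with the linear rate this example certifies.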
Model the problem in AutoLyap and search for the smallest \(\rho\)
- \(G_1\) is modeled by `MaximallyMonotone`.
- \(G_2\) is modeled as an intersection of `StronglyMonotone` and `Cocoercive`.
- The full inclusion is built with `InclusionProblem`.
- The update rule is represented by `DouglasRachford`.
- Distance-to-solution parameters are obtained with `IterationIndependent.LinearConvergence.get_parameters_distance_to_solution`.
- The contraction factor is searched with `IterationIndependent.LinearConvergence.bisection_search_rho`.
Run the iteration-independent analysis
This example uses the MOSEK Fusion backend (`backend="mosek_fusion"`).
Install the optional MOSEK dependency first:
pip install "autolyap[mosek]"
import numpy as np
from autolyap import SolverOptions
from autolyap.algorithms import DouglasRachford
from autolyap.problemclass import (
Cocoercive,
InclusionProblem,
MaximallyMonotone,
StronglyMonotone,
)
from autolyap.iteration_independent import IterationIndependent
mu = 1.0
beta = 0.5
gamma = 1.0
lambda_value = 2.0
problem = InclusionProblem(
[
MaximallyMonotone(), # G1
[StronglyMonotone(mu=mu), Cocoercive(beta=beta)], # G2 in intersection
]
)
algorithm = DouglasRachford(gamma=gamma, lambda_value=lambda_value, type="operator")
solver_options = SolverOptions(backend="mosek_fusion")
# License-free option:
# solver_options = SolverOptions(backend="cvxpy", cvxpy_solver="CLARABEL")
# Build V(P, p, k) and R(T, t, k) for distance-to-solution analysis.
P, T = IterationIndependent.LinearConvergence.get_parameters_distance_to_solution(
algorithm
)
result = IterationIndependent.LinearConvergence.bisection_search_rho(
problem,
algorithm,
P,
T,
S_equals_T=True,
s_equals_t=True,
remove_C3=True,
solver_options=solver_options,
)
if result["status"] != "feasible":
raise RuntimeError("No feasible Lyapunov certificate in the requested rho interval.")
rho_autolyap = result["rho"]
alpha = lambda_value / 2.0
delta = np.sqrt(
1.0 - (4.0 * gamma * mu) / (1.0 + 2.0 * gamma * mu + (gamma ** 2) * mu / beta)
)
rho_theory = (abs(1.0 - alpha) + alpha * delta) ** 2
print(f"rho (AutoLyap): {rho_autolyap:.8f}")
print(f"rho (theory): {rho_theory:.8f}")
What to inspect in `result`:

- `result["status"]`: `feasible`, `infeasible`, or `not_solved`.
- `result["solve_status"]`: raw backend status.
- `result["rho"]`: certified contraction factor when feasible.
- `result["certificate"]`: Lyapunov certificate matrices/scalars.
The computed value `rho (AutoLyap)` matches (up to solver numerical tolerances)
the theoretical rate expression in [Gis17, Theorem 7.4], i.e.,
\[
\rho = \left(\lvert 1 - \alpha\rvert + \alpha\,\delta\right)^2,
\]
where
\[
\alpha = \frac{\lambda}{2}
\]
and
\[
\delta = \sqrt{1 - \frac{4\gamma\mu}{1 + 2\gamma\mu + \gamma^{2}\mu/\beta}}.
\]
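As a quick sanity check of the closed-form rate, plugging in the parameter values used above (\(\mu=1\), \(\beta=0.5\), \(\gamma=1\), \(\lambda=2\)) gives \(\alpha=1\), \(\delta^2 = 1 - 4/5\), hence \(\rho = 0.2\):

```python
import numpy as np

# Evaluate the closed-form rate from [Gis17, Theorem 7.4] at the
# parameters used in the example above.
mu, beta, gamma, lam = 1.0, 0.5, 1.0, 2.0
alpha = lam / 2.0
delta = np.sqrt(1.0 - (4.0 * gamma * mu) / (1.0 + 2.0 * gamma * mu + gamma**2 * mu / beta))
rho = (abs(1.0 - alpha) + alpha * delta) ** 2
print(rho)  # approximately 0.2: alpha = 1 and delta^2 = 1 - 4/5
```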
Sweeping over 100 values of \(\gamma\) on \(0 < \gamma \le 5\) (with \(\mu=1\), \(\beta=0.5\), \(\lambda=2\)) gives the plot below, with the theoretical rate in black and AutoLyap certificates as blue dots.
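The theoretical side of that sweep can be sketched as follows; the AutoLyap side requires a solver backend and is omitted here, so this snippet only reproduces the black theoretical curve's values:

```python
import numpy as np

# Theoretical rate from [Gis17, Theorem 7.4] over a grid of step sizes,
# with the parameters used in the sweep (mu = 1, beta = 0.5, lambda = 2).
mu, beta, lam = 1.0, 0.5, 2.0
alpha = lam / 2.0

gammas = np.linspace(0.05, 5.0, 100)  # grid on 0 < gamma <= 5
deltas = np.sqrt(1.0 - (4.0 * gammas * mu) / (1.0 + 2.0 * gammas * mu + gammas**2 * mu / beta))
rhos = (abs(1.0 - alpha) + alpha * deltas) ** 2

# All rates lie strictly inside [0, 1), confirming a linear rate for
# every step size on the grid.
print(rhos.min(), rhos.max())
```

With \(\alpha = 1\) the rate reduces to \(\rho = \delta^2\), which is minimized near \(\gamma = 1/\sqrt{2}\) for these parameters.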
References
Pontus Giselsson. Tight global linear convergence rate bounds for Douglas–Rachford splitting. Journal of Fixed Point Theory and Applications, 19(4):2241–2270, 2017. doi:10.1007/s11784-017-0417-1.