The Douglas–Rachford method

This section collects Douglas–Rachford examples, one for each of the following sets of assumptions on the two operators (or functions) in the underlying splitting problem; the standard form of the iteration is recalled after the list.

  • Cocoercive + strongly monotone
  • Maximally monotone + strongly monotone/cocoercive
  • Maximally monotone + strongly monotone/Lipschitz
  • Maximally monotone/Lipschitz + strongly monotone
  • Smooth and strongly convex + convex
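
The subsections differ only in the assumptions placed on the two operators; the method itself is the same in every case. For orientation, a common way to write the relaxed Douglas–Rachford iteration for the inclusion 0 ∈ A(x) + B(x) is the following (the symbols A, B, γ, and λ_k are introduced here for illustration only and need not match the exact parametrization used in each example page):

\[
\begin{aligned}
x^k     &= J_{\gamma A}(z^k) = (\operatorname{Id} + \gamma A)^{-1}(z^k),\\
y^k     &= J_{\gamma B}\bigl(2x^k - z^k\bigr),\\
z^{k+1} &= z^k + \lambda_k \bigl(y^k - x^k\bigr),
\end{aligned}
\]

with step size γ > 0 and relaxation parameters λ_k ∈ (0, 2). In the convex-optimization setting (e.g. the smooth and strongly convex + convex case), one takes A = ∂f and B = ∂g for a composite problem min_x f(x) + g(x), so the resolvents reduce to the proximal operators prox_{γf} and prox_{γg}.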