5. Monotone inclusions

5.1. Proximal point

PEPit.examples.monotone_inclusions.wc_proximal_point(alpha, n, verbose=True)

Consider the monotone inclusion problem

\[\mathrm{Find}\, x:\, 0\in Ax,\]

where \(A\) is maximally monotone. We denote by \(J_A = (I + A)^{-1}\) the resolvent of \(A\).

This code computes a worst-case guarantee for the proximal point method. That is, it computes the smallest possible \(\tau(n, \alpha)\) such that the guarantee

\[\|x_n - x_{n-1}\|^2 \leqslant \tau(n, \alpha) \|x_0 - x_\star\|^2,\]

is valid, where \(x_\star\) is such that \(0 \in Ax_\star\).

Algorithm: The proximal point algorithm for monotone inclusions is described as follows, for \(t \in \{ 0, \dots, n-1\}\),

\[x_{t+1} = J_{\alpha A}(x_t),\]

where \(\alpha\) is a step-size.

Theoretical guarantee: A tight theoretical guarantee can be found in [1, section 4].

\[\|x_n - x_{n-1}\|^2 \leqslant \frac{\left(1 - \frac{1}{n}\right)^{n - 1}}{n} \|x_0 - x_\star\|^2.\]
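As a quick numerical sanity check (not part of PEPit), the method and the bound above can be evaluated on a toy maximally monotone operator. The instance below, a skew-symmetric linear operator whose resolvent is an explicit linear map, is an illustrative assumption.

```python
import numpy as np

# Toy instance (illustrative): A x = R x with R skew-symmetric is
# maximally monotone with unique zero x_* = 0, and its resolvent
# J_{alpha A} is the linear map (I + alpha*R)^{-1}.
R = np.array([[0.0, 1.0],
              [-1.0, 0.0]])
alpha, n = 2.0, 10
J = np.linalg.inv(np.eye(2) + alpha * R)

x = np.array([1.0, 0.0])  # x_0, so ||x_0 - x_*||^2 = 1
x_prev = x
for _ in range(n):
    x_prev, x = x, J @ x  # x_{t+1} = J_{alpha A}(x_t)

tau = (1 - 1 / n) ** (n - 1) / n         # bound from [1, section 4]
print(round(tau, 6))                     # 0.038742 for n = 10
print(np.sum((x - x_prev) ** 2) <= tau)  # the guarantee holds on this instance
```

For \(n = 10\) the bound evaluates to \((1 - 1/10)^{9}/10 \approx 0.038742\), matching the optimal value reported in the example below.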

Reference:

[1] G. Gu, J. Yang (2020). Tight sublinear convergence rate of the proximal point algorithm for maximal monotone inclusion problem. SIAM Journal on Optimization, 30(3), 1905-1921.

Parameters
  • alpha (float) – the step-size.

  • n (int) – number of iterations.

  • verbose (bool, optional) – if True, print conclusion.

Returns
  • pepit_tau (float) – worst-case value.

  • theoretical_tau (float) – theoretical value.

Example

>>> pepit_tau, theoretical_tau = wc_proximal_point(alpha=2, n=10, verbose=True)
(PEPit) Setting up the problem: size of the main PSD matrix: 12x12
(PEPit) Setting up the problem: performance measure is minimum of 1 element(s)
(PEPit) Setting up the problem: initial conditions (1 constraint(s) added)
(PEPit) Setting up the problem: interpolation conditions for 1 function(s)
         function 1 : 110 constraint(s) added
(PEPit) Compiling SDP
(PEPit) Calling SDP solver
(PEPit) Solver status: optimal (solver: SCS); optimal value: 0.03874199421010509
*** Example file: worst-case performance of the Proximal Point Method***
    PEPit guarantee:             ||x(n) - x(n-1)||^2 <= 0.038742 ||x0 - xs||^2
    Theoretical guarantee:       ||x(n) - x(n-1)||^2 <= 0.038742 ||x0 - xs||^2

5.2. Accelerated proximal point

PEPit.examples.monotone_inclusions.wc_accelerated_proximal_point(alpha, n, verbose=True)

Consider the monotone inclusion problem

\[\mathrm{Find}\, x:\, 0\in Ax,\]

where \(A\) is maximally monotone. We denote by \(J_A = (I + A)^{-1}\) the resolvent of \(A\).

This code computes a worst-case guarantee for the accelerated proximal point method proposed in [1]. That is, it computes the smallest possible \(\tau(n, \alpha)\) such that the guarantee

\[\|x_n - y_{n-1}\|^2 \leqslant \tau(n, \alpha) \|x_0 - x_\star\|^2,\]

is valid, where \(x_\star\) is such that \(0 \in Ax_\star\).

Algorithm: Accelerated proximal point is described as follows, for \(t \in \{ 0, \dots, n-1\}\),

\[\begin{split}\begin{eqnarray} x_{t+1} & = & J_{\alpha A}(y_t), \\ y_{t+1} & = & x_{t+1} + \frac{t}{t+2}(x_{t+1} - x_{t}) - \frac{t}{t+2}(x_t - y_{t-1}), \end{eqnarray}\end{split}\]

where \(x_0=y_0=y_{-1}\).

Theoretical guarantee: A tight theoretical worst-case guarantee can be found in [1, Theorem 4.1], for \(n \geqslant 1\),

\[\|x_n - y_{n-1}\|^2 \leqslant \frac{1}{n^2} \|x_0 - x_\star\|^2.\]
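As an illustrative sanity check (not part of PEPit), the recursion and the \(1/n^2\) bound can be evaluated on a toy skew-symmetric operator, whose resolvent is an explicit linear map; following [1], the update below uses the coefficient \(\frac{t}{t+2}\) on both correction terms.

```python
import numpy as np

# Toy instance (illustrative): A x = R x with R skew-symmetric is
# maximally monotone with unique zero x_* = 0.
R = np.array([[0.0, 1.0],
              [-1.0, 0.0]])
alpha, n = 2.0, 10
J = np.linalg.inv(np.eye(2) + alpha * R)   # J_{alpha A}

x = y = y_prev = np.array([1.0, 0.0])      # x_0 = y_0 = y_{-1}
for t in range(n):
    x_new = J @ y                          # x_{t+1} = J_{alpha A}(y_t)
    y_new = x_new + t / (t + 2) * ((x_new - x) - (x - y_prev))
    x, y_prev, y = x_new, y, y_new

tau = 1 / n ** 2                           # bound from [1, Theorem 4.1]
residual = np.sum((x - y_prev) ** 2)       # ||x_n - y_{n-1}||^2
print(residual <= tau)                     # the guarantee holds on this instance
```

On this instance the residual comes close to the \(1/n^2 = 0.01\) bound, consistent with the tightness reported in the example below.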

Reference:

[1] D. Kim (2021). Accelerated proximal point method for maximally monotone operators. Mathematical Programming, 1-31.

Parameters
  • alpha (float) – the step-size.

  • n (int) – number of iterations.

  • verbose (bool, optional) – if True, print conclusion.

Returns
  • pepit_tau (float) – worst-case value.

  • theoretical_tau (float) – theoretical value.

Example

>>> pepit_tau, theoretical_tau = wc_accelerated_proximal_point(alpha=2, n=10, verbose=True)
(PEPit) Setting up the problem: size of the main PSD matrix: 12x12
(PEPit) Setting up the problem: performance measure is minimum of 1 element(s)
(PEPit) Setting up the problem: initial conditions (1 constraint(s) added)
(PEPit) Setting up the problem: interpolation conditions for 1 function(s)
         function 1 : 110 constraint(s) added
(PEPit) Compiling SDP
(PEPit) Calling SDP solver
(PEPit) Solver status: optimal (solver: SCS); optimal value: 0.010000353550061647
*** Example file: worst-case performance of the Accelerated Proximal Point Method***
    PEPit guarantee:             ||x_n - y_n||^2 <= 0.0100004 ||x_0 - x_s||^2
    Theoretical guarantee:       ||x_n - y_n||^2 <= 0.01 ||x_0 - x_s||^2

5.3. Douglas-Rachford splitting

PEPit.examples.monotone_inclusions.wc_douglas_rachford_splitting(L, mu, alpha, theta, verbose=True)

Consider the monotone inclusion problem

\[\mathrm{Find}\, x:\, 0\in Ax + Bx,\]

where \(A\) is \(L\)-Lipschitz and maximally monotone and \(B\) is (maximally) \(\mu\)-strongly monotone. We denote by \(J_{\alpha A}\) and \(J_{\alpha B}\) the resolvents of \(A\) and \(B\), respectively, with step-size \(\alpha\).

This code computes a worst-case guarantee for the Douglas-Rachford splitting (DRS). That is, given two initial points \(w^{(0)}_t\) and \(w^{(1)}_t\), this code computes the smallest possible \(\tau(L, \mu, \alpha, \theta)\) (a.k.a. “contraction factor”) such that the guarantee

\[\|w^{(0)}_{t+1} - w^{(1)}_{t+1}\|^2 \leqslant \tau(L, \mu, \alpha, \theta) \|w^{(0)}_{t} - w^{(1)}_{t}\|^2,\]

is valid, where \(w^{(0)}_{t+1}\) and \(w^{(1)}_{t+1}\) are obtained after one iteration of DRS from respectively \(w^{(0)}_{t}\) and \(w^{(1)}_{t}\).

In short, for given values of \(L\), \(\mu\), \(\alpha\) and \(\theta\), the contraction factor \(\tau(L, \mu, \alpha, \theta)\) is computed as the worst-case value of \(\|w^{(0)}_{t+1} - w^{(1)}_{t+1}\|^2\) when \(\|w^{(0)}_{t} - w^{(1)}_{t}\|^2 \leqslant 1\).

Algorithm: One iteration of the Douglas-Rachford splitting is described as follows, for \(t \in \{ 0, \dots, n-1\}\),

\begin{eqnarray} x_{t+1} & = & J_{\alpha B} (w_t),\\ y_{t+1} & = & J_{\alpha A} (2x_{t+1}-w_t),\\ w_{t+1} & = & w_t - \theta (x_{t+1}-y_{t+1}). \end{eqnarray}
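A minimal sketch of one DRS iteration on an illustrative toy instance (not part of PEPit): with \(R\) skew-symmetric, \(Ax = LRx\) is \(L\)-Lipschitz and maximally monotone, and \(Bx = \mu x\) is \(\mu\)-strongly monotone, so both resolvents are explicit.

```python
import numpy as np

L, mu, alpha, theta = 1.0, 0.1, 1.3, 0.9

# Toy operators (illustrative): A x = L*R x with R skew-symmetric is
# L-Lipschitz and monotone; B x = mu*x is mu-strongly monotone.
R = np.array([[0.0, 1.0],
              [-1.0, 0.0]])
J_A = np.linalg.inv(np.eye(2) + alpha * L * R)  # resolvent of alpha*A

def J_B(w):
    # resolvent of alpha*B for B = mu*I
    return w / (1 + alpha * mu)

def drs_step(w):
    x = J_B(w)                  # x_{t+1} = J_{alpha B}(w_t)
    y = J_A @ (2 * x - w)       # y_{t+1} = J_{alpha A}(2 x_{t+1} - w_t)
    return w - theta * (x - y)  # w_{t+1} = w_t - theta (x_{t+1} - y_{t+1})

# Contraction between two arbitrary starting points
w0, w1 = np.array([1.0, 0.0]), np.array([0.0, 0.5])
ratio = np.sum((drs_step(w0) - drs_step(w1)) ** 2) / np.sum((w0 - w1) ** 2)
print(ratio <= 0.928771)  # within the contraction factor reported in the example below
```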

Theoretical guarantee: Theoretical worst-case guarantees can be found in [1, section 4, Theorem 4.3]. Since the results of [2] tighten those of [1], we compare with [2, Theorem 4.3] below. The theoretical results being quite involved, we do not reproduce them here.

References: The detailed PEP methodology for studying operator splitting is provided in [2].

[1] W. Moursi, L. Vandenberghe (2019). Douglas–Rachford Splitting for the Sum of a Lipschitz Continuous and a Strongly Monotone Operator. Journal of Optimization Theory and Applications 183, 179–198.

[2] E. Ryu, A. Taylor, C. Bergeling, P. Giselsson (2020). Operator splitting performance estimation: Tight contraction factors and optimal parameter selection. SIAM Journal on Optimization, 30(3), 2251-2271.

Parameters
  • L (float) – the Lipschitz parameter.

  • mu (float) – the strongly monotone parameter.

  • alpha (float) – the step-size in the resolvent.

  • theta (float) – algorithm parameter.

  • verbose (bool) – if True, print conclusion.

Returns
  • pepit_tau (float) – worst-case value.

  • theoretical_tau (float) – theoretical value.

Example

>>> pepit_tau, theoretical_tau  = wc_douglas_rachford_splitting(L=1, mu=.1, alpha=1.3, theta=.9, verbose=True)
(PEPit) Setting up the problem: size of the main PSD matrix: 6x6
(PEPit) Setting up the problem: performance measure is minimum of 1 element(s)
(PEPit) Setting up the problem: initial conditions (1 constraint(s) added)
(PEPit) Setting up the problem: interpolation conditions for 2 function(s)
         function 1 : 4 constraint(s) added
         function 2 : 2 constraint(s) added
(PEPit) Compiling SDP
(PEPit) Calling SDP solver
(PEPit) Solver status: optimal (solver: SCS); optimal value: 0.928770693164459
*** Example file: worst-case performance of the Douglas Rachford Splitting***
    PEPit guarantee:             ||w_(t+1)^0 - w_(t+1)^1||^2 <= 0.928771 ||w_(t)^0 - w_(t)^1||^2
    Theoretical guarantee:       ||w_(t+1)^0 - w_(t+1)^1||^2 <= 0.928771 ||w_(t)^0 - w_(t)^1||^2

5.4. Three operator splitting

PEPit.examples.monotone_inclusions.wc_three_operator_splitting(L, mu, beta, alpha, theta, verbose=True)

Consider the monotone inclusion problem

\[\mathrm{Find}\, x:\, 0\in Ax + Bx + Cx,\]

where \(A\) is maximally monotone, \(B\) is \(\beta\)-cocoercive and \(C\) is the gradient of some \(L\)-smooth \(\mu\)-strongly convex function. We denote by \(J_{\alpha A}\) and \(J_{\alpha B}\) the resolvents of \(A\) and \(B\), respectively, with step-size \(\alpha\).

This code computes a worst-case guarantee for the three operator splitting (TOS). That is, given two initial points \(w^{(0)}_t\) and \(w^{(1)}_t\), this code computes the smallest possible \(\tau(L, \mu, \beta, \alpha, \theta)\) (a.k.a. “contraction factor”) such that the guarantee

\[\|w^{(0)}_{t+1} - w^{(1)}_{t+1}\|^2 \leqslant \tau(L, \mu, \beta, \alpha, \theta) \|w^{(0)}_{t} - w^{(1)}_{t}\|^2,\]

is valid, where \(w^{(0)}_{t+1}\) and \(w^{(1)}_{t+1}\) are obtained after one iteration of TOS from respectively \(w^{(0)}_{t}\) and \(w^{(1)}_{t}\).

In short, for given values of \(L\), \(\mu\), \(\beta\), \(\alpha\) and \(\theta\), the contraction factor \(\tau(L, \mu, \beta, \alpha, \theta)\) is computed as the worst-case value of \(\|w^{(0)}_{t+1} - w^{(1)}_{t+1}\|^2\) when \(\|w^{(0)}_{t} - w^{(1)}_{t}\|^2 \leqslant 1\).

Algorithm: One iteration of the algorithm is described in [1]. For \(t \in \{ 0, \dots, n-1\}\),

\begin{eqnarray} x_{t+1} & = & J_{\alpha B} (w_t),\\ y_{t+1} & = & J_{\alpha A} (2x_{t+1} - w_t - \alpha C x_{t+1}),\\ w_{t+1} & = & w_t - \theta (x_{t+1} - y_{t+1}). \end{eqnarray}
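A minimal sketch of one TOS iteration on an illustrative toy instance (not part of PEPit): \(Ax = Rx\) with \(R\) skew-symmetric (maximally monotone), \(B = I\) (which is \(1\)-cocoercive), and \(C = \nabla f\) for the \(L\)-smooth \(\mu\)-strongly convex quadratic \(f(x) = (\mu x_1^2 + L x_2^2)/2\); the gradient step is scaled by \(\alpha\), as in the Davis-Yin form.

```python
import numpy as np

L, mu, beta, alpha, theta = 1.0, 0.1, 1.0, 0.9, 1.3

# Toy operators (illustrative): A x = R x (maximally monotone), B = I
# (1-cocoercive, so beta = 1), and C = grad f for the L-smooth,
# mu-strongly convex quadratic f(x) = (mu*x1^2 + L*x2^2)/2.
R = np.array([[0.0, 1.0],
              [-1.0, 0.0]])
C = np.diag([mu, L])
J_A = np.linalg.inv(np.eye(2) + alpha * R)  # resolvent of alpha*A

def J_B(w):
    # resolvent of alpha*B for B = I
    return w / (1 + alpha)

def tos_step(w):
    x = J_B(w)                               # x_{t+1} = J_{alpha B}(w_t)
    y = J_A @ (2 * x - w - alpha * (C @ x))  # y_{t+1} = J_{alpha A}(2 x_{t+1} - w_t - alpha C x_{t+1})
    return w - theta * (x - y)               # w_{t+1} = w_t - theta (x_{t+1} - y_{t+1})

# Contraction between two arbitrary starting points
w0, w1 = np.array([1.0, 0.0]), np.array([0.0, 0.5])
ratio = np.sum((tos_step(w0) - tos_step(w1)) ** 2) / np.sum((w0 - w1) ** 2)
print(ratio <= 0.779690)  # within the factor reported in the example below
```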

References: The TOS was proposed in [1], the analysis of such operator splitting methods using PEPs was proposed in [2].

[1] D. Davis, W. Yin (2017). A three-operator splitting scheme and its optimization applications. Set-valued and variational analysis, 25(4), 829-858.

[2] E. Ryu, A. Taylor, C. Bergeling, P. Giselsson (2020). Operator splitting performance estimation: Tight contraction factors and optimal parameter selection. SIAM Journal on Optimization, 30(3), 2251-2271.

Parameters
  • L (float) – smoothness constant of C.

  • mu (float) – strong convexity of C.

  • beta (float) – cocoercivity parameter of B.

  • alpha (float) – step-size (in the resolvents).

  • theta (float) – overrelaxation parameter.

  • verbose (bool) – if True, print conclusion.

Returns
  • pepit_tau (float) – worst-case value.

  • theoretical_tau (None) – no theoretical value.

Example

>>> pepit_tau, theoretical_tau = wc_three_operator_splitting(L=1, mu=.1, beta=1, alpha=.9, theta=1.3, verbose=True)
(PEPit) Setting up the problem: size of the main PSD matrix: 8x8
(PEPit) Setting up the problem: performance measure is minimum of 1 element(s)
(PEPit) Setting up the problem: initial conditions (1 constraint(s) added)
(PEPit) Setting up the problem: interpolation conditions for 3 function(s)
         function 1 : 2 constraint(s) added
         function 2 : 2 constraint(s) added
         function 3 : 2 constraint(s) added
(PEPit) Compiling SDP
(PEPit) Calling SDP solver
(PEPit) Solver status: optimal (solver: MOSEK); optimal value: 0.7796890317399627
*** Example file: worst-case contraction factor of the Three Operator Splitting ***
        PEPit guarantee:         ||w_(t+1)^0 - w_(t+1)^1||^2 <= 0.779689 ||w_(t)^0 - w_(t)^1||^2