6. Fixed point

6.1. Halpern iteration

PEPit.examples.fixed_point_problems.wc_halpern_iteration(n, verbose=True)[source]

Consider the fixed point problem

\[\mathrm{Find}\, x:\, x = Ax,\]

where \(A\) is a non-expansive operator, that is, an \(L\)-Lipschitz operator with \(L=1\).

This code computes a worst-case guarantee for the Halpern Iteration. That is, it computes the smallest possible \(\tau(n)\) such that the guarantee

\[\|x_n - Ax_n\|^2 \leqslant \tau(n) \|x_0 - x_\star\|^2\]

is valid, where \(x_n\) is the output of the Halpern iteration, and \(x_\star\) the fixed point of \(A\).

In short, for a given value of \(n\), \(\tau(n)\) is computed as the worst-case value of \(\|x_n - Ax_n\|^2\) when \(\|x_0 - x_\star\|^2 \leqslant 1\).

Algorithm: The Halpern iteration can be written as

\[x_{t+1} = \frac{1}{t + 2} x_0 + \left(1 - \frac{1}{t + 2}\right) Ax_t.\]
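The recursion above can be sketched numerically. The snippet below is a minimal illustration, not part of the PEPit example: the helper `halpern` and the choice of a planar rotation as the non-expansive operator (1-Lipschitz, fixed point at the origin) are assumptions made for the demo.

```python
import numpy as np

def halpern(A, x0, n):
    """Run n steps of the Halpern iteration x_{t+1} = 1/(t+2) x0 + (1 - 1/(t+2)) A(x_t)."""
    x = x0
    for t in range(n):
        lam = 1.0 / (t + 2)
        x = lam * x0 + (1 - lam) * A(x)
    return x

# Toy non-expansive operator: rotation by 90 degrees (orthogonal, hence 1-Lipschitz).
# Its unique fixed point is the origin, so ||x0 - x_*|| = ||x0||.
R = np.array([[0.0, -1.0],
              [1.0,  0.0]])
A = lambda x: R @ x

x0 = np.array([1.0, 0.0])
xn = halpern(A, x0, n=25)
residual = np.linalg.norm(xn - A(xn)) ** 2
bound = (2 / (25 + 1)) ** 2 * np.linalg.norm(x0) ** 2
```

For any non-expansive operator the residual must stay below the bound; the rotation is merely one convenient instance, not the worst case certified by the PEP.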

Theoretical guarantee: A tight worst-case guarantee for Halpern iteration can be found in [1, Theorem 2.1]:

\[\|x_n - Ax_n\|^2 \leqslant \left(\frac{2}{n+1}\right)^2 \|x_0 - x_\star\|^2.\]
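As a quick sanity check, the bound above can be evaluated directly; for \(n=25\) it reproduces the theoretical value printed in the example below:

```python
# Evaluate the tight Halpern bound tau(n) = (2 / (n + 1))**2 from [1, Theorem 2.1].
n = 25
theoretical_tau = (2 / (n + 1)) ** 2
print(f"{theoretical_tau:.8f}")  # 0.00591716
```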

References: The detailed approach and tight bound are available in [1].

[1] F. Lieder (2021). On the convergence rate of the Halpern-iteration. Optimization Letters, 15(2), 405-418.

Parameters
  • n (int) – number of iterations.

  • verbose (bool) – if True, print conclusion.

Returns
  • pepit_tau (float) – worst-case value

  • theoretical_tau (float) – theoretical value

Example

>>> pepit_tau, theoretical_tau = wc_halpern_iteration(n=25, verbose=True)
(PEPit) Setting up the problem: size of the main PSD matrix: 28x28
(PEPit) Setting up the problem: performance measure is minimum of 1 element(s)
(PEPit) Setting up the problem: initial conditions (1 constraint(s) added)
(PEPit) Setting up the problem: interpolation conditions for 1 function(s)
         function 1 : 702 constraint(s) added
(PEPit) Compiling SDP
(PEPit) Calling SDP solver
(PEPit) Solver status: optimal (solver: SCS); optimal value: 0.005933984368783424
*** Example file: worst-case performance of Halpern Iterations ***
    PEPit guarantee:             ||xN - AxN||^2 <= 0.00593398 ||x0 - x_*||^2
    Theoretical guarantee:       ||xN - AxN||^2 <= 0.00591716 ||x0 - x_*||^2

6.2. Krasnoselskii-Mann with constant step-sizes

PEPit.examples.fixed_point_problems.wc_krasnoselskii_mann_constant_step_sizes(n, gamma, verbose=True)[source]

Consider the fixed point problem

\[\mathrm{Find}\, x:\, x = Ax,\]

where \(A\) is a non-expansive operator, that is, an \(L\)-Lipschitz operator with \(L=1\).

This code computes a worst-case guarantee for the Krasnoselskii-Mann (KM) method with constant step-size. That is, it computes the smallest possible \(\tau(n)\) such that the guarantee

\[\frac{1}{4}\|x_n - Ax_n\|^2 \leqslant \tau(n) \|x_0 - x_\star\|^2\]

is valid, where \(x_n\) is the output of the KM method, and \(x_\star\) is some fixed point of \(A\) (i.e., \(x_\star=Ax_\star\)).

Algorithm: The constant step-size KM method is described by

\[x_{t+1} = \left(1 - \gamma\right) x_{t} + \gamma Ax_{t}.\]
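A minimal numerical sketch of this recursion follows; the helper `krasnoselskii_mann` and the reflection \(A(x)=-x\) (1-Lipschitz, fixed point at the origin) are illustrative assumptions, not part of the PEPit example.

```python
import numpy as np

def krasnoselskii_mann(A, x0, gamma, n):
    """Run n steps of the constant step-size KM method x_{t+1} = (1 - gamma) x_t + gamma A(x_t)."""
    x = x0
    for _ in range(n):
        x = (1 - gamma) * x + gamma * A(x)
    return x

# Toy non-expansive operator: the reflection A(x) = -x, whose fixed point is 0.
A = lambda x: -x
x0 = np.array([1.0, 0.0])
xn = krasnoselskii_mann(A, x0, gamma=3 / 4, n=3)

# Performance measure used in this example: (1/4) ||x_n - A x_n||^2.
residual = 0.25 * np.linalg.norm(xn - A(xn)) ** 2
```

With \(\gamma = 3/4\) the reflection gives \(x_{t+1} = -x_t/2\), so after 3 steps the measure equals \(1/64 = 0.015625\), comfortably below the worst-case value \(0.140625\) computed in the example below.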

Theoretical guarantee: A theoretical upper bound is provided by [1, Theorem 4.9]:

\[\tau(n) = \begin{cases} \frac{1}{n+1}\left(\frac{n}{n+1}\right)^n \frac{1}{4 \gamma (1 - \gamma)} & \text{if } \frac{1}{2}\leqslant \gamma \leqslant \frac{1}{2}\left(1+\sqrt{\frac{n}{n+1}}\right), \\ (\gamma - 1)^{2n} & \text{if } \frac{1}{2}\left(1+\sqrt{\frac{n}{n+1}}\right) < \gamma \leqslant 1. \end{cases}\]
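The piecewise bound can be coded up directly; `km_theoretical_tau` below is a hypothetical helper name chosen for this sketch. For \(n=3\), \(\gamma=3/4\), the threshold \(\frac{1}{2}(1+\sqrt{3/4}) \approx 0.933\) places \(\gamma\) in the first branch, recovering the theoretical value in the example below.

```python
import math

def km_theoretical_tau(n, gamma):
    """Evaluate the piecewise worst-case bound of [1, Theorem 4.9] for gamma in [1/2, 1]."""
    threshold = (1 + math.sqrt(n / (n + 1))) / 2
    if gamma <= threshold:
        return (1 / (n + 1)) * (n / (n + 1)) ** n / (4 * gamma * (1 - gamma))
    return (gamma - 1) ** (2 * n)

print(km_theoretical_tau(3, 3 / 4))  # 0.140625
```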

Reference:

[1] F. Lieder (2018). Projection Based Methods for Conic Linear Programming Optimal First Order Complexities and Norm Constrained Quasi Newton Methods. PhD thesis, HHU Düsseldorf.

Parameters
  • n (int) – number of iterations.

  • gamma (float) – step-size, between 1/2 and 1.

  • verbose (bool, optional) – if True, print conclusion

Returns
  • pepit_tau (float) – worst-case value

  • theoretical_tau (float) – theoretical value

Example

>>> pepit_tau, theoretical_tau = wc_krasnoselskii_mann_constant_step_sizes(n=3, gamma=3 / 4, verbose=True)
(PEPit) Setting up the problem: size of the main PSD matrix: 6x6
(PEPit) Setting up the problem: performance measure is minimum of 1 element(s)
(PEPit) Setting up the problem: initial conditions (1 constraint(s) added)
(PEPit) Setting up the problem: interpolation conditions for 1 function(s)
         function 1 : 20 constraint(s) added
(PEPit) Compiling SDP
(PEPit) Calling SDP solver
(PEPit) Solver status: optimal (solver: SCS); optimal value: 0.14062586461718285
*** Example file: worst-case performance of Kranoselskii-Mann iterations ***
    PEPit guarantee:             1/4||xN - AxN||^2 <= 0.140626 ||x0 - x_*||^2
    Theoretical guarantee:       1/4||xN - AxN||^2 <= 0.140625 ||x0 - x_*||^2

6.3. Krasnoselskii-Mann with increasing step-sizes

PEPit.examples.fixed_point_problems.wc_krasnoselskii_mann_increasing_step_sizes(n, verbose=True)[source]

Consider the fixed point problem

\[\mathrm{Find}\, x:\, x = Ax,\]

where \(A\) is a non-expansive operator, that is, an \(L\)-Lipschitz operator with \(L=1\).

This code computes a worst-case guarantee for the Krasnoselskii-Mann (KM) method with increasing step-sizes. That is, it computes the smallest possible \(\tau(n)\) such that the guarantee

\[\frac{1}{4}\|x_n - Ax_n\|^2 \leqslant \tau(n) \|x_0 - x_\star\|^2\]

is valid, where \(x_n\) is the output of the KM method, and \(x_\star\) is some fixed point of \(A\) (i.e., \(x_\star=Ax_\star\)).

Algorithm: The KM method is described by

\[x_{t+1} = \frac{1}{t + 2} x_{t} + \left(1 - \frac{1}{t + 2}\right) Ax_{t}.\]
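Note that, unlike the Halpern iteration, the weight \(\frac{1}{t+2}\) here anchors to the current iterate \(x_t\) rather than to \(x_0\). A minimal sketch follows, again under the illustrative assumption of a 90-degree rotation as the non-expansive operator; the helper `km_increasing` is not part of the PEPit API.

```python
import numpy as np

def km_increasing(A, x0, n):
    """Run n steps of x_{t+1} = 1/(t+2) x_t + (1 - 1/(t+2)) A(x_t)."""
    x = x0
    for t in range(n):
        lam = 1.0 / (t + 2)
        x = lam * x + (1 - lam) * A(x)
    return x

# Toy non-expansive operator: rotation by 90 degrees, with fixed point at the origin.
R = np.array([[0.0, -1.0],
              [1.0,  0.0]])
A = lambda x: R @ x

x0 = np.array([1.0, 0.0])
res0 = 0.25 * np.linalg.norm(x0 - A(x0)) ** 2
xn = km_increasing(A, x0, n=3)
resn = 0.25 * np.linalg.norm(xn - A(xn)) ** 2
```

For this operator each step strictly contracts the norm, so the residual measure decreases from its initial value.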

Reference: This scheme was first studied using PEPs in [1].

[1] F. Lieder (2018). Projection Based Methods for Conic Linear Programming Optimal First Order Complexities and Norm Constrained Quasi Newton Methods. PhD thesis, HHU Düsseldorf.

Parameters
  • n (int) – number of iterations.

  • verbose (bool, optional) – if True, print conclusion

Returns
  • pepit_tau (float) – worst-case value

  • theoretical_tau (None) – no theoretical value

Example

>>> pepit_tau, theoretical_tau = wc_krasnoselskii_mann_increasing_step_sizes(n=3, verbose=True)
(PEPit) Setting up the problem: size of the main PSD matrix: 6x6
(PEPit) Setting up the problem: performance measure is minimum of 1 element(s)
(PEPit) Setting up the problem: initial conditions (1 constraint(s) added)
(PEPit) Setting up the problem: interpolation conditions for 1 function(s)
         function 1 : 20 constraint(s) added
(PEPit) Compiling SDP
(PEPit) Calling SDP solver
(PEPit) Solver status: optimal (solver: SCS); optimal value: 0.11963406474148795
*** Example file: worst-case performance of Kranoselskii-Mann iterations ***
    PEPit guarantee:             1/4||xN - AxN||^2 <= 0.119634 ||x0 - x_*||^2