Functions
Convex
- class PEPit.functions.ConvexFunction(_, is_leaf=True, decomposition_dict=None, reuse_gradient=False)[source]
Bases: Function
The ConvexFunction class overwrites the add_class_constraints method of Function, implementing the interpolation constraints of the class of convex, closed, and proper (CCP) functions (i.e., convex functions whose epigraphs are non-empty closed sets). General CCP functions are not characterized by any parameter, hence can be instantiated as
Example
>>> from PEPit import PEP
>>> problem = PEP()
>>> func = problem.declare_function(function_class=ConvexFunction, param=dict())
- Parameters
is_leaf (bool) – True if self is defined from scratch, False if self is defined as a linear combination of leaf Function objects.
decomposition_dict (dict) – decomposition of self as a linear combination of leaf Function objects. Keys are Function objects and values are their associated coefficients.
reuse_gradient (bool) – If True, the same subgradient is returned whenever it is requested several times at the same Point. If False, a new subgradient is computed each time one is required.
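For context, the interpolation conditions for this class take a standard, well-known form (a sketch, not quoted from the PEPit source): a set of triplets \((x_i, g_i, f_i)\) is interpolable by a CCP function, with \(g_i\) a subgradient at \(x_i\) and \(f_i\) the function value, if and only if, for all pairs of indices \(i, j\),

\[ f_i \geq f_j + \langle g_j, x_i - x_j \rangle. \]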
Strongly convex
- class PEPit.functions.StronglyConvexFunction(param, is_leaf=True, decomposition_dict=None, reuse_gradient=False)[source]
Bases: Function
The StronglyConvexFunction class overwrites the add_class_constraints method of Function, implementing the interpolation constraints of the class of strongly convex closed proper functions (i.e., strongly convex functions whose epigraphs are non-empty closed sets).
- Attributes
mu (float) – strong convexity parameter
Strongly convex functions are characterized by the strong convexity parameter \(\mu\), hence can be instantiated as
Example
>>> from PEPit import PEP
>>> problem = PEP()
>>> func = problem.declare_function(function_class=StronglyConvexFunction, param={'mu': .1})
- Parameters
param (dict) – contains the value of mu
is_leaf (bool) – True if self is defined from scratch, False if self is defined as a linear combination of leaf Function objects.
decomposition_dict (dict) – decomposition of self as a linear combination of leaf Function objects. Keys are Function objects and values are their associated coefficients.
reuse_gradient (bool) – If True, the same subgradient is returned whenever it is requested several times at the same Point. If False, a new subgradient is computed each time one is required.
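For context, the interpolation conditions for \(\mu\)-strongly convex functions take the following standard form (a sketch, not quoted from the PEPit source): for all pairs of indices \(i, j\), with \(g_j\) a subgradient at \(x_j\),

\[ f_i \geq f_j + \langle g_j, x_i - x_j \rangle + \frac{\mu}{2}\|x_i - x_j\|^2. \]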
Smooth
- class PEPit.functions.SmoothFunction(param, is_leaf=True, decomposition_dict=None, reuse_gradient=True)[source]
Bases: Function
The SmoothFunction class overwrites the add_class_constraints method of Function, implementing the interpolation constraints of the class of smooth (not necessarily convex) functions.
- Attributes
L (float) – smoothness parameter
Smooth functions are characterized by the smoothness parameter L, hence can be instantiated as
Example
>>> from PEPit import PEP
>>> problem = PEP()
>>> func = problem.declare_function(function_class=SmoothFunction, param={'L': 1})
- Parameters
param (dict) – contains the value of L
is_leaf (bool) – True if self is defined from scratch, False if self is defined as a linear combination of leaf Function objects.
decomposition_dict (dict) – decomposition of self as a linear combination of leaf Function objects. Keys are Function objects and values are their associated coefficients.
reuse_gradient (bool) – If True, the same gradient is returned whenever it is requested several times at the same Point. If False, a new gradient is computed each time one is required.
Note
Smooth functions are necessarily differentiable, hence reuse_gradient is set to True.
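For context, a standard way to write the interpolation conditions for L-smooth (possibly nonconvex) functions is to take the smooth strongly convex conditions with \(\mu = -L\) (a sketch, not quoted from the PEPit source): for all pairs of indices \(i, j\),

\[ f_i \geq f_j + \langle g_j, x_i - x_j \rangle + \frac{1}{2L}\|g_i - g_j\|^2 - \frac{L}{4}\left\|x_i - x_j - \frac{1}{L}(g_i - g_j)\right\|^2. \]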
Convex and smooth
- class PEPit.functions.SmoothConvexFunction(param, is_leaf=True, decomposition_dict=None, reuse_gradient=True)[source]
Bases: SmoothStronglyConvexFunction
The SmoothConvexFunction class implements smooth convex functions as particular cases of SmoothStronglyConvexFunction.
- Attributes
L (float) – smoothness parameter
Smooth convex functions are characterized by the smoothness parameter L, hence can be instantiated as
Example
>>> from PEPit import PEP
>>> problem = PEP()
>>> func = problem.declare_function(function_class=SmoothConvexFunction, param={'L': 1})
- Parameters
param (dict) – contains the value of L
is_leaf (bool) – True if self is defined from scratch, False if self is defined as a linear combination of leaf Function objects.
decomposition_dict (dict) – decomposition of self as a linear combination of leaf Function objects. Keys are Function objects and values are their associated coefficients.
reuse_gradient (bool) – If True, the same gradient is returned whenever it is requested several times at the same Point. If False, a new gradient is computed each time one is required.
Note
Smooth convex functions are necessarily differentiable, hence reuse_gradient is set to True.
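For context, the interpolation conditions for L-smooth convex functions take the following standard form, i.e., the smooth strongly convex conditions with \(\mu = 0\) (a sketch, not quoted from the PEPit source): for all pairs of indices \(i, j\),

\[ f_i \geq f_j + \langle g_j, x_i - x_j \rangle + \frac{1}{2L}\|g_i - g_j\|^2. \]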
Strongly convex and smooth
- class PEPit.functions.SmoothStronglyConvexFunction(param, is_leaf=True, decomposition_dict=None, reuse_gradient=True)[source]
Bases: Function
The SmoothStronglyConvexFunction class overwrites the add_class_constraints method of Function, implementing the interpolation constraints of the class of smooth strongly convex functions.
- Attributes
mu (float) – strong convexity parameter
L (float) – smoothness parameter
Smooth strongly convex functions are characterized by parameters \(\mu\) and L, hence can be instantiated as
Example
>>> from PEPit import PEP
>>> problem = PEP()
>>> func = problem.declare_function(function_class=SmoothStronglyConvexFunction, param={'mu': .1, 'L': 1})
- Parameters
param (dict) – contains the values of mu and L
is_leaf (bool) – True if self is defined from scratch, False if self is defined as a linear combination of leaf Function objects.
decomposition_dict (dict) – decomposition of self as a linear combination of leaf Function objects. Keys are Function objects and values are their associated coefficients.
reuse_gradient (bool) – If True, the same gradient is returned whenever it is requested several times at the same Point. If False, a new gradient is computed each time one is required.
Note
Smooth strongly convex functions are necessarily differentiable, hence reuse_gradient is set to True.
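For context, the interpolation conditions for L-smooth \(\mu\)-strongly convex functions take the following standard form (a sketch, not quoted from the PEPit source; see, e.g., Taylor, Hendrickx, and Glineur, 2017): for all pairs of indices \(i, j\),

\[ f_i \geq f_j + \langle g_j, x_i - x_j \rangle + \frac{1}{2L}\|g_i - g_j\|^2 + \frac{\mu}{2(1 - \mu/L)}\left\|x_i - x_j - \frac{1}{L}(g_i - g_j)\right\|^2. \]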
Convex and Lipschitz continuous
- class PEPit.functions.ConvexLipschitzFunction(param, is_leaf=True, decomposition_dict=None, reuse_gradient=False)[source]
Bases: Function
The ConvexLipschitzFunction class overwrites the add_class_constraints method of Function, implementing the interpolation constraints of the class of convex closed proper (CCP) Lipschitz continuous functions.
- Attributes
M (float) – Lipschitz parameter
CCP Lipschitz continuous functions are characterized by a parameter M, hence can be instantiated as
Example
>>> from PEPit import PEP
>>> problem = PEP()
>>> func = problem.declare_function(function_class=ConvexLipschitzFunction, param={'M': 1})
- Parameters
param (dict) – contains the value of M
is_leaf (bool) – True if self is defined from scratch, False if self is defined as a linear combination of leaf Function objects.
decomposition_dict (dict) – decomposition of self as a linear combination of leaf Function objects. Keys are Function objects and values are their associated coefficients.
reuse_gradient (bool) – If True, the same subgradient is returned whenever it is requested several times at the same Point. If False, a new subgradient is computed each time one is required.
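For context, a standard formulation of the interpolation conditions for this class (a sketch, not quoted from the PEPit source) combines the CCP subgradient inequalities with a bound on every subgradient norm: for all pairs of indices \(i, j\),

\[ f_i \geq f_j + \langle g_j, x_i - x_j \rangle, \qquad \|g_i\| \leq M. \]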
Convex indicator
- class PEPit.functions.ConvexIndicatorFunction(param, is_leaf=True, decomposition_dict=None, reuse_gradient=False)[source]
Bases: Function
The ConvexIndicatorFunction class overwrites the add_class_constraints method of Function, implementing the interpolation constraints of the class of closed convex indicator functions.
- Attributes
D (float) – upper bound on the diameter of the feasible set
Convex indicator functions are characterized by a parameter D, hence can be instantiated as
Example
>>> from PEPit import PEP
>>> problem = PEP()
>>> func = problem.declare_function(function_class=ConvexIndicatorFunction, param={'D': 1})
- Parameters
param (dict) – contains the value of D
is_leaf (bool) – True if self is defined from scratch, False if self is defined as a linear combination of leaf Function objects.
decomposition_dict (dict) – decomposition of self as a linear combination of leaf Function objects. Keys are Function objects and values are their associated coefficients.
reuse_gradient (bool) – If True, the same subgradient is returned whenever it is requested several times at the same Point. If False, a new subgradient is computed each time one is required.
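For context, a standard formulation of the interpolation conditions for indicator functions of a convex set with diameter at most D (a sketch, not quoted from the PEPit source): for all feasible points \(x_i, x_j\), with \(g_i\) a subgradient (normal vector) at \(x_i\),

\[ f_i = 0, \qquad \langle g_i, x_j - x_i \rangle \leq 0, \qquad \|x_i - x_j\| \leq D. \]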