Publications
See Google Scholar for the full list of my publications.
2024
- Accelerated Saddle Flow Dynamics for Bilinearly Coupled Minimax Problems
  Yingzhu Liu, Enrique Mallada, Zhongkui Li, and Pengcheng You, 2024
Minimax problems have attracted much attention due to their applications in constrained optimization and zero-sum games. Identifying saddle points within these problems is crucial, and saddle flow dynamics offer a straightforward yet effective approach. This study focuses on a class of bilinearly coupled minimax problems with strongly convex-linear objective functions. We design an accelerated algorithm based on saddle flow dynamics, achieving a convergence rate beyond the conventional limit set by the strong convexity constant. The algorithm is derived from a sequential two-step transformation of a given objective function. First, a change of variables is applied to render the objective function better conditioned, introducing strong concavity (from linearity) while preserving strong convexity. Second, proximal regularization, when staggered with the first step, further enhances the strong convexity of the objective function by shifting some of the strong concavity obtained. After these transformations, saddle flow dynamics based on the new objective function can be tuned for accelerated exponential convergence. Moreover, the approach extends to weakly convex-weakly concave functions while still guaranteeing exponential convergence to a stationary point. The theory is verified by a numerical test on an affine equality-constrained convex optimization problem.
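For intuition, the plain (unaccelerated) saddle flow that this paper builds on can be sketched in a few lines. The problem instance, dimensions, and step size below are illustrative assumptions, not taken from the paper; the accelerated variant with its change of variables and proximal regularization is not reproduced here.

```python
import numpy as np

# Saddle flow for the bilinearly coupled objective
#   L(x, lam) = (mu/2) ||x||^2 + lam @ (A x - b),
# strongly convex in x, linear in lam. Forward-Euler
# discretization of  xdot = -grad_x L,  lamdot = +grad_lam L.
mu = 1.0
A = np.array([[1.0, 2.0]])   # a single affine equality constraint
b = np.array([3.0])

x = np.zeros(2)
lam = np.zeros(1)
eta = 0.05                   # step size for the Euler discretization
for _ in range(20_000):
    x = x - eta * (mu * x + A.T @ lam)   # descent on the primal variable
    lam = lam + eta * (A @ x - b)        # ascent on the dual variable

# The saddle point recovers the solution of
#   min (mu/2)||x||^2  s.t.  A x = b,
# i.e. x* = A.T @ inv(A @ A.T) @ b.
print(x, lam)
```

Because the coupling is bilinear and the primal part is only strongly convex, the continuous flow converges at a rate governed by the strong convexity constant `mu`; the paper's two-step transformation is what pushes the rate beyond this limit.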
- A Unified Analysis of Saddle Flow Dynamics: Stability and Algorithm Design
  Pengcheng You, Yingzhu Liu, and Enrique Mallada, 2024
This work examines the conditions for asymptotic and exponential convergence of saddle flow dynamics of convex-concave functions. First, we propose an observability-based certificate for asymptotic convergence, directly bridging the gap between the invariant set in a LaSalle argument and the equilibrium set of saddle flows. This certificate generalizes conventional conditions for convergence, e.g., strict convexity-concavity, and leads to a novel state-augmentation method that requires minimal assumptions for asymptotic convergence. We also show that global exponential stability follows from strong convexity-strong concavity, providing a lower-bound estimate of the convergence rate. This insight also explains the convergence of proximal saddle flows for strongly convex-concave objective functions. Our results generalize to dynamics with projections on the vector field and have applications in solving constrained convex optimization via primal-dual methods. Based on these insights, we study four algorithms built upon different Lagrangian function transformations. We validate our work by applying these methods to solve a network flow optimization problem and a Lasso regression problem.
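The projected dynamics mentioned above can be illustrated on a one-dimensional inequality-constrained problem; the specific problem, step size, and iteration count below are illustrative assumptions, and the update is a discretized sketch of primal-dual gradient dynamics rather than any of the paper's four algorithms.

```python
# Projected saddle flow (primal-dual gradient dynamics) for
#   min (1/2) x^2   s.t.   x >= 1,
# via the Lagrangian L(x, lam) = (1/2) x^2 + lam * (1 - x), lam >= 0.
# The projection clips the dual ascent direction so that lam stays
# in the nonnegative orthant, as in projected saddle flows.
x, lam = 0.0, 0.0
eta = 0.01                                  # Euler step size
for _ in range(5_000):
    x = x - eta * (x - lam)                 # primal descent on L
    lam = max(0.0, lam + eta * (1.0 - x))   # projected dual ascent

# The trajectory settles at the KKT point x* = 1, lam* = 1
# (the constraint is active at the optimum).
print(x, lam)
```

The same pattern scales to general constrained convex programs: descend on the primal variables, ascend on the multipliers, and project the multipliers of inequality constraints onto the nonnegative orthant.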