Accelerated Nonconvex ADMM with Self-Adaptive Penalty for Rank-Constrained Model Identification
Author: | Liu, Qingyuan, Huang, Zhengchao, Ye, Hao, Huang, Dexian, Shang, Chao |
---|---|
Publication Year: | 2023 |
Document Type: | Working Paper |
Description: | The alternating direction method of multipliers (ADMM) has been widely adopted in low-rank approximation and low-order model identification tasks; however, the performance of nonconvex ADMM is highly reliant on the choice of penalty parameter. To accelerate ADMM for solving rank-constrained identification problems, this paper proposes a new self-adaptive strategy for automatic penalty update. Guided by a first-order analysis of the increment of the augmented Lagrangian, the self-adaptive penalty update enables effective and balanced minimization of both primal and dual residuals and thus ensures stable convergence. Moreover, improved efficiency can be obtained within the Anderson acceleration scheme. Numerical examples show that the proposed strategy significantly accelerates the convergence of nonconvex ADMM while alleviating the critical reliance on tedious tuning of penalty parameters. Comment: 7 pages, 5 figures. Accepted by 62nd IEEE Conference on Decision and Control (CDC 2023) |
Database: | arXiv |
External Link: |