Abstract:
Polynomial Lyapunov functions $\mathcal{V}(\bf x)$ provide mathematically rigorous stability certificates that convert stability analysis into efficiently solvable optimization problems. Traditional numerical methods rely on user-defined templates, while emerging neural $\mathcal{V}(\bf x)$ offer flexibility but suffer from the poor generalization of naive {\it Square} polynomial networks. In this paper, we propose a novel learning-enabled polynomial $\mathcal{V}(\bf x)$ synthesis approach, in which a data-driven machine learning process guided by target-based sampling fits candidate $\mathcal{V}(\bf x)$ that are naturally compatible with sum-of-squares (SOS) soundness verification. The framework is structured as an iterative loop between a {\it Learner} and a {\it Verifier}: the {\it Learner} trains an expressive polynomial $\mathcal{V}(\bf x)$ network via polynomial expansions, while the {\it Verifier} encodes the learned candidates with SOS constraints and identifies a true $\mathcal{V}(\bf x)$ by solving linear matrix inequality (LMI) feasibility problems. The entire procedure is driven by a high-accuracy counterexample guidance technique that further enhances efficiency. Experimental results demonstrate that our approach outperforms both SMT-based polynomial neural Lyapunov function synthesis and the traditional SOS method.
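As a rough illustration of the soundness check (our notation, not taken from the abstract): for a polynomial system $\dot{\bf x} = f({\bf x})$ with an equilibrium at the origin, the {\it Verifier}'s SOS encoding of a candidate $\mathcal{V}(\bf x)$ typically amounts to conditions of the form
\begin{align*}
\mathcal{V}({\bf 0}) = 0, \qquad
\mathcal{V}({\bf x}) - \epsilon \|{\bf x}\|^2 \in \Sigma[{\bf x}], \qquad
-\nabla \mathcal{V}({\bf x})^{\top} f({\bf x}) - \epsilon \|{\bf x}\|^2 \in \Sigma[{\bf x}],
\end{align*}
where $\Sigma[{\bf x}]$ denotes the cone of SOS polynomials and $\epsilon > 0$ is a small positivity margin; each SOS membership reduces to an LMI feasibility test on the Gram matrix of the corresponding polynomial.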