Refined Subspace Projection for Model Reduction via Interpolation and Tensor Decomposition
Publication Type:
Article
Authors:
Jiang, Yao-Lin; Zhang, Guo-Yun
Affiliation:
Xi'an Jiaotong University
Journal:
IEEE TRANSACTIONS ON AUTOMATIC CONTROL
ISSN/ISBN:
0018-9286
DOI:
10.1109/TAC.2024.3514517
Publication Date:
2025
Pages:
3771-3783
Keywords:
Tensors
Nonlinear systems
Polynomials
Kernel
Eigenvalues and eigenfunctions
Mathematical models
Linear systems
Interpolation
Vectors
Stability criteria
Kernel matching
Low-rank approximation
Model order reduction (MOR)
Parallel
Projection-based
Tensor decomposition
Abstract:
Research on nonlinear model order reduction has revealed that, as nonlinearity increases, the subspaces capturing dominant information require more complex bases. This complexity is influenced by two main factors: the coefficients and the approximation criteria. On the one hand, the complexity is affected by the characteristics of all coefficients; we therefore begin by introducing a generalized Gramian-based method for estimating eigenvalue decay, which shows how these factors determine the reduced order. On the other hand, existing interpolation conditions based on transfer functions must match multiple features with different bases. This article confirms that kernels, as new criteria, can match all features of a large class of nonlinear systems using the same basis as the linear part. We propose a $k$-dimensional refined space for affine input/output nonlinear systems, in contrast to existing methods that require at least an $O(nk)$-dimensional space to match $n$ transfer functions at $k$ interpolation points, even for $0$th moment matching. To broaden the applicability, we present a rank-$R$ quadratic approximation that transforms general systems into normalized affine input/output nonlinear systems. For computational efficiency, we propose a parallel partial columnwise least-squares method to further reduce the rank. Finally, we provide two numerical examples to illustrate the effectiveness of our method.
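As a rough illustration of two ingredients named in the abstract (Gramian-based eigenvalue-decay estimation for choosing a reduced order, and projection onto an interpolation subspace for moment matching), the minimal Python sketch below uses standard linear-MOR building blocks only. It is not the paper's refined-subspace or kernel-matching algorithm; the matrices A, B, C, the interpolation points sigmas, and the tolerance tol are illustrative assumptions.

```python
# Hypothetical sketch of standard linear-MOR building blocks; NOT the
# refined-subspace / kernel-matching method of the paper.
import numpy as np
from scipy.linalg import solve_continuous_lyapunov, qr

def gramian_decay_order(A, B, tol=1e-8):
    """Estimate a reduced order from the eigenvalue decay of the
    controllability Gramian P, obtained from A P + P A^T + B B^T = 0."""
    P = solve_continuous_lyapunov(A, -B @ B.T)
    eigvals = np.clip(np.sort(np.linalg.eigvalsh(P))[::-1], 0.0, None)
    # Keep eigenvalues that are significant relative to the largest one.
    return int(np.sum(eigvals > tol * eigvals[0])), eigvals

def interpolation_basis(A, B, sigmas):
    """Orthonormal basis of span{(sigma_i I - A)^{-1} B}, the subspace that
    enforces 0th-moment (transfer-function value) matching at each sigma_i."""
    n = A.shape[0]
    cols = [np.linalg.solve(s * np.eye(n) - A, B) for s in sigmas]
    V, _ = qr(np.hstack(cols), mode='economic')
    return V

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = 50
    A = -np.eye(n) + 0.1 * rng.standard_normal((n, n))  # assumed stable test matrix
    B = rng.standard_normal((n, 1))
    C = rng.standard_normal((1, n))

    r, decay = gramian_decay_order(A, B)
    print("suggested reduced order:", r)

    V = interpolation_basis(A, B, sigmas=[0.1, 1.0, 10.0])
    # Galerkin projection of the linear part onto the interpolation subspace.
    Ar, Br, Cr = V.T @ A @ V, V.T @ B, C @ V
    print("reduced dimensions:", Ar.shape)
```

In this sketch the basis V has one column per interpolation point, which mirrors the abstract's point of comparison: matching several transfer functions of a nonlinear system with separate bases would multiply this dimension, whereas the paper's refined space keeps it at $k$.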