TY - JOUR
T1 - Manifold learning for parameter reduction
AU - Holiday, Alexander
AU - Kooshkbaghi, Mahdi
AU - Bello-Rivas, Juan M.
AU - Gear, C. William
AU - Zagaris, Antonios
AU - Kevrekidis, Ioannis G.
PY - 2019/9
Y1 - 2019/9
N2 - Large-scale dynamical systems (e.g. many nonlinear coupled differential equations) can often be summarized in terms of only a few state variables (a few equations), a trait that reduces complexity and facilitates exploration of behavioral aspects of otherwise intractable models. High model dimensionality and complexity make symbolic, pen-and-paper model reduction tedious and impractical, a difficulty addressed by recently developed frameworks that computerize reduction. Symbolic work has the benefit, however, of identifying both reduced state variables and the parameter combinations that matter most (effective parameters, “inputs”), whereas current computational reduction schemes leave the parameter reduction aspect mostly unaddressed. As the interest in mapping out and optimizing complex input–output relations keeps growing, it becomes clear that combating the curse of dimensionality also requires efficient schemes for input space exploration and reduction. Here, we explore systematic, data-driven parameter reduction by means of effective parameter identification, starting from current nonlinear manifold-learning techniques enabling state space reduction. Our approach aspires to extend the data-driven determination of effective state variables with the data-driven discovery of effective model parameters, and thus to accelerate the exploration of high-dimensional parameter spaces associated with complex models.
AB - Large-scale dynamical systems (e.g. many nonlinear coupled differential equations) can often be summarized in terms of only a few state variables (a few equations), a trait that reduces complexity and facilitates exploration of behavioral aspects of otherwise intractable models. High model dimensionality and complexity make symbolic, pen-and-paper model reduction tedious and impractical, a difficulty addressed by recently developed frameworks that computerize reduction. Symbolic work has the benefit, however, of identifying both reduced state variables and the parameter combinations that matter most (effective parameters, “inputs”), whereas current computational reduction schemes leave the parameter reduction aspect mostly unaddressed. As the interest in mapping out and optimizing complex input–output relations keeps growing, it becomes clear that combating the curse of dimensionality also requires efficient schemes for input space exploration and reduction. Here, we explore systematic, data-driven parameter reduction by means of effective parameter identification, starting from current nonlinear manifold-learning techniques enabling state space reduction. Our approach aspires to extend the data-driven determination of effective state variables with the data-driven discovery of effective model parameters, and thus to accelerate the exploration of high-dimensional parameter spaces associated with complex models.
KW - Data-driven perturbation theory
KW - Data mining
KW - Diffusion maps
KW - Model reduction
KW - Parameter sloppiness
U2 - 10.1016/j.jcp.2019.04.015
DO - 10.1016/j.jcp.2019.04.015
M3 - Article
AN - SCOPUS:85065436223
VL - 392
SP - 419
EP - 431
JO - Journal of Computational Physics
JF - Journal of Computational Physics
SN - 0021-9991
ER -