Manifold learning for parameter reduction

Alexander Holiday, Mahdi Kooshkbaghi, Juan M. Bello-Rivas, C. William Gear, Antonios Zagaris, Ioannis G. Kevrekidis

Research output: Contribution to journal › Article › Academic › peer-review

Abstract

Large-scale dynamical systems (e.g., many nonlinear coupled differential equations) can often be summarized in terms of only a few state variables (a few equations), a trait that reduces complexity and facilitates exploration of behavioral aspects of otherwise intractable models. High model dimensionality and complexity make symbolic, pen-and-paper model reduction tedious and impractical, a difficulty addressed by recently developed frameworks that computerize reduction. Symbolic work has the benefit, however, of identifying both reduced state variables and the parameter combinations that matter most (effective parameters, "inputs"), whereas current computational reduction schemes leave the parameter reduction aspect mostly unaddressed. As interest in mapping out and optimizing complex input-output relations keeps growing, it becomes clear that combating the curse of dimensionality also requires efficient schemes for input space exploration and reduction. Here, we explore systematic, data-driven parameter reduction by means of effective parameter identification, starting from current nonlinear manifold-learning techniques that enable state space reduction. Our approach aspires to extend the data-driven determination of effective state variables with the data-driven discovery of effective model parameters, and thus to accelerate the exploration of high-dimensional parameter spaces associated with complex models.
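
For a concrete feel for the machinery behind the "Diffusion maps" keyword, the sketch below applies a plain diffusion-map construction to the outputs of a toy "sloppy" model in which two parameters enter only through their product. This is an illustrative, assumption-laden example and not the authors' algorithm: the toy model, the median bandwidth heuristic, and the final correlation check are all stand-ins chosen here for brevity.

```python
import numpy as np

# Toy "sloppy" model, purely for illustration (not from the paper):
# the outputs depend on the two parameters (a, b) only through the
# effective combination c = a * b, so the cloud of model outputs should
# form a one-dimensional manifold parameterized by c.
rng = np.random.default_rng(0)
n = 400
a = rng.uniform(0.5, 2.0, n)
b = rng.uniform(0.5, 2.0, n)
t = np.linspace(0.0, 5.0, 50)
outputs = np.exp(-np.outer(a * b, t))  # row i: time series for sample i

# Pairwise squared distances between output vectors (Gram-matrix trick).
sq = (outputs ** 2).sum(axis=1)
d2 = np.maximum(sq[:, None] + sq[None, :] - 2.0 * outputs @ outputs.T, 0.0)

# Standard diffusion-map construction (Coifman-Lafon, alpha = 1):
# Gaussian kernel, density normalization, row-stochastic Markov matrix.
eps = np.median(d2)  # a common, crude bandwidth heuristic
K = np.exp(-d2 / eps)
q = K.sum(axis=1)
K1 = K / np.outer(q, q)  # remove sampling-density effects
d = K1.sum(axis=1)

# Eigendecompose the symmetric conjugate of the Markov matrix for
# numerical stability, then map back to its right eigenvectors.
S = K1 / np.sqrt(np.outer(d, d))
vals, vecs = np.linalg.eigh(S)
order = np.argsort(vals)[::-1]
phi = vecs[:, order] / np.sqrt(d)[:, None]

# phi[:, 0] is trivially constant; the first nontrivial diffusion
# coordinate should vary monotonically with the effective parameter
# a * b (the sign of an eigenvector is arbitrary).
print("corr(phi_1, a*b) =", np.corrcoef(phi[:, 1], a * b)[0, 1])
```

Run as-is, the printed correlation should come out close to ±1, consistent with a single effective parameter combination; this is a sanity check for the toy setup only, not a reproduction of the paper's results.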

Language: English
Pages: 419-431
Journal: Journal of Computational Physics
Volume: 392
DOI: 10.1016/j.jcp.2019.04.015
Publication status: Published - September 2019

Keywords

  • Data driven perturbation theory
  • Data mining
  • Diffusion maps
  • Model reduction
  • Parameter sloppiness

Cite this

Holiday, A., Kooshkbaghi, M., Bello-Rivas, J. M., William Gear, C., Zagaris, A., & Kevrekidis, I. G. (2019). Manifold learning for parameter reduction. Journal of Computational Physics, 392, 419-431. https://doi.org/10.1016/j.jcp.2019.04.015