Mutual information for explainable deep learning of multiscale systems

Søren Taverniers, Eric J. Hall, Markos A. Katsoulakis, Daniel M. Tartakovsky

Research output: Contribution to journal › Article › peer-review

Abstract

Timely completion of design cycles for complex systems ranging from consumer electronics to hypersonic vehicles relies on rapid simulation-based prototyping. The latter typically involves high-dimensional spaces of possibly correlated control variables (CVs) and quantities of interest (QoIs) with non-Gaussian and possibly multimodal distributions. We develop a model-agnostic, moment-independent global sensitivity analysis (GSA) that relies on differential mutual information to rank the effects of CVs on QoIs. The data requirements of this information-theoretic approach to GSA are met by replacing computationally intensive components of the physics-based model with a deep neural network surrogate. Subsequently, the GSA is used to explain the surrogate predictions, and the surrogate-driven GSA is deployed as an uncertainty quantification emulator to close design loops. Viewed as an uncertainty quantification method for interrogating the surrogate, this framework is compatible with a wide variety of black-box models. We demonstrate that the surrogate-driven mutual information GSA provides useful and distinguishable rankings via a validation step for applications of interest in energy storage. Consequently, our information-theoretic GSA provides an “outer loop” for accelerated product design by identifying the most and least sensitive input directions and performing subsequent optimization over appropriately reduced parameter subspaces.
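To make the ranking idea concrete, the following minimal sketch (not from the paper) illustrates how surrogate-generated samples can be used to rank control variables by their estimated mutual information with a scalar quantity of interest. The toy `surrogate` function, the sample sizes, and the use of scikit-learn's nearest-neighbour (Kraskov-style) estimator are all assumptions for illustration; the paper's own surrogate is a deep neural network and its estimator may differ.

```python
import numpy as np
from sklearn.feature_selection import mutual_info_regression

# Hypothetical example: rank control variables (CVs) by their mutual
# information with a quantity of interest (QoI). The "surrogate" below is a
# toy analytic stand-in for a trained deep neural network that replaces the
# expensive physics-based model component.

rng = np.random.default_rng(0)

def surrogate(x):
    # QoI depends strongly on x[:, 0], weakly on x[:, 1], not at all on x[:, 2].
    return np.sin(3.0 * x[:, 0]) + 0.3 * x[:, 1] ** 2 + 0.01 * rng.normal(size=x.shape[0])

n_samples, n_cvs = 5000, 3
cvs = rng.uniform(-1.0, 1.0, size=(n_samples, n_cvs))  # sampled control variables
qoi = surrogate(cvs)                                    # surrogate-predicted QoI

# Nearest-neighbour estimate of mutual information between each CV and the
# QoI; larger values indicate more influential inputs.
mi = mutual_info_regression(cvs, qoi, n_neighbors=5, random_state=0)
for rank, idx in enumerate(np.argsort(mi)[::-1], start=1):
    print(f"rank {rank}: CV {idx}, MI ~ {mi[idx]:.3f}")
```

In this sketch the ranking would identify CV 0 as most influential and CV 2 as negligible, mirroring how the GSA of the paper identifies the most and least sensitive input directions before optimization over a reduced parameter subspace.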

Original language: English
Article number: 110551
Number of pages: 19
Journal: Journal of Computational Physics
Volume: 444
Early online date: 9 Jul 2021
DOIs
Publication status: E-pub ahead of print - 9 Jul 2021

Keywords

  • Surrogate model
  • Mutual information
  • Global sensitivity analysis
  • Black box
  • Probabilistic graphical model
  • Electrical double-layer capacitor
