Inverse Evolution Layers: Physics-informed Regularizers for Deep Neural Networks

Chaoyu Liu, Zhonghua Qiao, Chao Li, Carola-Bibiane Schönlieb (Lead / Corresponding author)

Research output: Working paper / Preprint

Abstract

This paper proposes a novel approach to integrating partial differential equation (PDE)-based evolution models into neural networks through a new type of regularization. Specifically, we propose inverse evolution layers (IELs), which are derived from evolution equations. These layers achieve specific regularization objectives and endow neural network outputs with properties of the corresponding evolution models. Moreover, IELs are straightforward to construct and implement, and can easily be designed for a wide range of physical evolutions and neural networks. The design process also gives neural networks an intuitive, mathematical interpretation, enhancing the transparency and explainability of the approach. To demonstrate the effectiveness, efficiency, and simplicity of our approach, we present an example that endows semantic segmentation models with a smoothness property derived from the heat-diffusion model: we design heat-diffusion IELs and apply them to the challenge of semantic segmentation with noisy labels. The experimental results demonstrate that heat-diffusion IELs effectively mitigate the overfitting caused by noisy labels.
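As a rough illustration of the mechanism the abstract describes, the NumPy sketch below implements one inverse step of forward-Euler heat diffusion. The function name, step size, and periodic 5-point Laplacian are illustrative assumptions, not the paper's exact construction: the idea is that reversing a diffusion step amplifies high-frequency (non-smooth) components, so a training loss computed after such a layer penalizes noisy predictions more strongly.

```python
import numpy as np

def inverse_heat_step(u, dt=0.2):
    """One inverse-evolution step for the 2-D heat equation.

    Forward Euler heat diffusion is u_next = u + dt * Laplacian(u);
    the inverse step u_prev = u - dt * Laplacian(u) amplifies
    high-frequency components instead of damping them. Uses a
    5-point Laplacian with periodic boundaries (an assumption; the
    paper's exact discretization may differ).
    """
    lap = (np.roll(u, 1, axis=0) + np.roll(u, -1, axis=0)
           + np.roll(u, 1, axis=1) + np.roll(u, -1, axis=1) - 4.0 * u)
    return u - dt * lap

# A smooth (constant) field passes through unchanged ...
smooth = np.ones((8, 8))
smooth_out = inverse_heat_step(smooth)

# ... while a noisy field has its oscillations amplified, so a loss
# evaluated on the layer's output penalizes non-smooth predictions.
rng = np.random.default_rng(0)
noisy = rng.standard_normal((8, 8))
noisy_out = inverse_heat_step(noisy)
```

In spectral terms, every nonzero frequency of the discrete Laplacian has a non-positive eigenvalue, so the map `u - dt * lap` multiplies each mode by a factor of at least 1 while preserving the mean, which is why smooth inputs are (nearly) fixed points and rough inputs grow.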
Original language: English
Publisher: arXiv
Number of pages: 13
DOIs
Publication status: Published - 14 Jul 2023

