Arend Torres, Fabricio. Mass conservative neural networks. 2024, Doctoral Thesis, University of Basel, Faculty of Science.
Official URL: https://edoc.unibas.ch/96821/
Abstract
Neural networks have established themselves as a powerful tool for extracting insights from vast amounts of data. However, with the increasing use of deep learning in the natural sciences, there is also an increasing demand to incorporate domain-specific expert knowledge. In many cases, this knowledge comes in the form of constraints expressed as partial differential equations (PDEs). In addition to potentially improving generalization capabilities, enforcing such constraints ensures predictions that are consistent with available domain knowledge. In this thesis, we focus on enforcing the physical law of mass conservation in neural networks, with the aim of modeling densities and velocities of compressible fluids. Specifically, we enforce the continuity equation, a PDE describing mass conservation in its local, differential form. We focus on models in continuous space and time.
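For reference, the continuity equation in its local, differential form relates a density $\rho(x, t)$ to a velocity field $v(x, t)$ via

$$
\frac{\partial \rho}{\partial t} + \nabla \cdot (\rho\, v) = 0 .
$$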
In our first contribution, we weakly enforce the continuity equation by minimizing a PDE penalty at so-called collocation points. Specifically, we provide an extension to physics-informed neural networks (PINNs). Motivated by the microscopic perspective of a fluid density, we propose to select collocation points by sampling particles from the (normalized) fluid density with dynamic Monte Carlo methods. This mesh-free and adaptive sampling method improves the sample efficiency of enforcing the continuity equation and other density-based advection-diffusion PDEs, which we demonstrate through various experiments.
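To illustrate the general idea, the following is a minimal sketch of such a PDE penalty, not the thesis implementation: the network `DensityVelocityNet`, its architecture, and the placeholder collocation points are assumptions made here for illustration. The penalty is the mean squared residual of the continuity equation, evaluated with automatic differentiation at the collocation points.

```python
# Hypothetical sketch of a continuity-equation penalty at collocation points
# (PyTorch assumed; names and architecture are illustrative, not the thesis code).
import torch
import torch.nn as nn

class DensityVelocityNet(nn.Module):
    """Maps space-time coordinates (x, t) to a density rho and a velocity v."""
    def __init__(self, dim=2, hidden=64):
        super().__init__()
        self.body = nn.Sequential(
            nn.Linear(dim + 1, hidden), nn.Tanh(),
            nn.Linear(hidden, hidden), nn.Tanh(),
            nn.Linear(hidden, 1 + dim),  # outputs [rho, v_1, ..., v_dim]
        )

    def forward(self, x, t):
        out = self.body(torch.cat([x, t], dim=-1))
        rho = torch.nn.functional.softplus(out[..., :1])  # keep the density positive
        v = out[..., 1:]
        return rho, v

def continuity_residual(net, x, t):
    """Residual of d(rho)/dt + div(rho * v) at the given collocation points."""
    x = x.requires_grad_(True)
    t = t.requires_grad_(True)
    rho, v = net(x, t)
    flux = rho * v  # shape (N, dim)
    # time derivative of the density
    drho_dt = torch.autograd.grad(rho.sum(), t, create_graph=True)[0]
    # divergence of the flux, one spatial component at a time
    div = 0.0
    for i in range(flux.shape[-1]):
        grad_i = torch.autograd.grad(flux[..., i].sum(), x, create_graph=True)[0]
        div = div + grad_i[..., i:i + 1]
    return drho_dt + div

# Collocation points would be drawn from the (normalized) density itself with a
# dynamic Monte Carlo sampler, as proposed in the thesis; placeholders used here.
net = DensityVelocityNet(dim=2)
x_colloc = torch.rand(128, 2)
t_colloc = torch.rand(128, 1)
pde_penalty = continuity_residual(net, x_colloc, t_colloc).pow(2).mean()
```

In a PINN-style training loop, `pde_penalty` would be added to the data-fitting loss; the contribution described above concerns drawing `x_colloc` from the (normalized) fluid density rather than from a fixed grid or uniform distribution.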
In our second contribution, we propose Lagrangian Flow Networks (LFlows), a framework for constructing neural networks that satisfy the continuity equation by construction. We do so by leveraging insights from classical theory on Lagrangian flows, which allow us to model physically consistent densities and velocities with time-conditioned diffeomorphisms, i.e., conditional normalizing flows. This approach not only offers high predictive accuracy in density modeling tasks, but also proves computationally efficient. We showcase LFlows in both 2D and 3D scenarios and apply it to the real-world application of bird migration modeling.
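A rough sketch of the underlying construction, namely the classical Lagrangian-flow change-of-variables identity (the exact parameterization used in the thesis may differ): given a time-conditioned diffeomorphism $\Phi_t$ and a base density $\rho_0$, defining

$$
\rho(x, t) = \rho_0\!\left(\Phi_t^{-1}(x)\right)\,\left|\det \frac{\partial \Phi_t^{-1}(x)}{\partial x}\right|,
\qquad
v(x, t) = \partial_t \Phi_t\!\left(\Phi_t^{-1}(x)\right),
$$

yields a density-velocity pair that satisfies the continuity equation by construction.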
In summary, we study different approaches for incorporating PDEs, particularly the continuity equation, into neural networks for modeling compressible fluids. The resulting methods ensure physical consistency of the predictions while maintaining computational efficiency.
| Advisors | Roth, Volker |
|---|---|
| Committee Members | Vetter, Thomas and Hoop, Maarten Valentijn de |
| Faculties and Departments | 05 Faculty of Science > Departement Mathematik und Informatik > Informatik > Biomedical Data Analysis (Roth) |
| UniBasel Contributors | Roth, Volker and Vetter, Thomas |
| Item Type | Thesis |
| Thesis Subtype | Doctoral Thesis |
| Thesis no | 15592 |
| Thesis status | Complete |
| Number of Pages | xv, 150 |
| Language | English |
| Identification Number | |
| edoc DOI | |
| Last Modified | 04 Feb 2025 05:30 |
| Deposited On | 03 Feb 2025 11:18 |