Physics-Informed Machine Learning
Authors: Johanna Moser (Graz University of Technology), Christopher Albert (Graz University of Technology), Sascha Ranftl (Brown University)
Hybrid approaches combining differential equations and machine learning, commonly referred to as physics-informed machine learning, have gained significant attention in recent years. Prominent examples include Physics-Informed Neural Networks (PINNs) and Physics-Informed Gaussian Processes (PIGPs), the latter naturally providing uncertainty quantification. PIGPs encode differential constraints directly in the covariance kernel, and existing approaches can be roughly grouped into two schools of thought.
Operator-based constructions apply differential operators to a base kernel, yielding systematic and algorithmic methods, but they are often restricted in which systems they can represent or suffer from performance issues. In contrast, Mercer-type constructions build kernels from problem-specific solution components such as Green’s functions or fundamental solutions; while typically data-efficient, they rely on analytical insight and substantial manual derivation. We propose Monge-GPs, a hybrid construction based on Monge parametrization that unifies operator-based kernels and Mercer kernels using fundamental solutions. By parametrizing the controllable dynamics algorithmically and restricting problem-specific design to a low-dimensional autonomous component, the approach substantially reduces the need for manual kernel design and remains data-efficient while lifting the restriction to controllable systems.
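As a minimal illustration of the operator-based school (a generic sketch of the idea, not the Monge-GP construction proposed here), one can derive the covariance of derivative observations by applying a linear differential operator to both arguments of a squared-exponential base kernel; the symbol names and length-scale value below are illustrative assumptions:

```python
import sympy as sp

# Operator-based kernel construction (illustrative sketch): applying a
# differential operator L to both arguments of a base kernel yields the
# covariance between operator observations Lf(x) and Lf(xp).
x, xp = sp.symbols("x xp", real=True)
l = sp.Symbol("l", positive=True)  # length scale (hypothetical value below)

# Squared-exponential (RBF) base kernel k(x, xp)
k = sp.exp(-(x - xp) ** 2 / (2 * l ** 2))

# For L = d/dx, differentiate the kernel in both arguments:
# closed form (1/l**2 - (x - xp)**2 / l**4) * k(x, xp)
k_LL = sp.simplify(sp.diff(k, x, xp))

# Turn the symbolic covariance into a numerical function
k_fn = sp.lambdify((x, xp, l), k_LL, "numpy")
print(k_fn(0.0, 0.0, 2.0))  # → 0.25, i.e. 1/l**2 at x == xp
```

The derivation is fully algorithmic: swapping in another linear operator or base kernel only changes the `sp.diff` call, which is what makes operator-based constructions systematic; the trade-off discussed above is that not every system of interest can be expressed this way.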
How to Cite: Moser, J., Albert, C. & Ranftl, S. (2026) “Introducing Monge-GPs: A new class of physics-informed Gaussian Processes (extended abstract)”, Proceedings of the Austrian Symposium on AI, Robotics, and Vision. 3(1).