Advanced stationary and nonstationary kernel designs for domain-aware Gaussian processes

Marcus M. Noack and James A. Sethian

Vol. 17 (2022), No. 1, 131–156
Abstract

Gaussian process regression is a widely applied method for function approximation and uncertainty quantification. The technique has recently gained popularity in the machine learning community due to its robustness and interpretability. The mathematical methods we discuss in this paper extend the Gaussian process framework. We propose advanced kernel designs that only allow functions with certain desirable characteristics to be elements of the reproducing kernel Hilbert space (RKHS) that underlies all kernel methods and serves as the sample space for Gaussian process regression. These desirable characteristics reflect the underlying physics; two obvious examples are symmetry and periodicity constraints. In addition, we draw attention to nonstationary kernel designs that can be defined in the same framework to yield flexible multitask Gaussian processes. We show the impact of advanced kernel designs on Gaussian processes using several synthetic and two scientific data sets. The results show that informing a Gaussian process of domain knowledge and additional flexibility, communicated through advanced kernel designs, has a significant impact on the accuracy and relevance of the function approximation.
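As a rough, self-contained illustration of the general idea described in the abstract (not the kernel constructions developed in the paper itself), the following Python sketch builds a periodicity-constrained kernel and a symmetry-constrained kernel from a standard squared-exponential base kernel and uses them in plain Gaussian process regression. All function names, hyperparameter values, and the toy data below are illustrative assumptions.

import numpy as np

def rbf(x1, x2, length_scale=1.0):
    # Stationary squared-exponential base kernel on 1-d inputs
    d = x1[:, None] - x2[None, :]
    return np.exp(-0.5 * (d / length_scale) ** 2)

def periodic_kernel(x1, x2, period=2.0, length_scale=1.0):
    # Stationary kernel whose sample paths repeat with the given period
    d = np.abs(x1[:, None] - x2[None, :])
    return np.exp(-2.0 * np.sin(np.pi * d / period) ** 2 / length_scale ** 2)

def symmetric_kernel(x1, x2, length_scale=1.0):
    # Averaging the base kernel over the reflection x -> -x restricts the
    # GP's sample space to even functions, f(x) = f(-x)
    return 0.5 * (rbf(x1, x2, length_scale) + rbf(x1, -x2, length_scale))

def gp_posterior_mean(x_train, y_train, x_test, kernel, noise=1e-6):
    # Standard GP regression posterior mean with the chosen kernel
    K = kernel(x_train, x_train) + noise * np.eye(len(x_train))
    return kernel(x_test, x_train) @ np.linalg.solve(K, y_train)

# Toy example: noisy samples of an even function, observed on one side only
x = np.linspace(0.1, 3.0, 15)
y = np.cos(x) + 0.01 * np.random.randn(15)
x_new = np.linspace(-3.0, 3.0, 50)       # predictions cover the mirrored side too
mean = gp_posterior_mean(x, y, x_new, symmetric_kernel)

Because the symmetrized kernel's RKHS contains only even functions, the posterior mean in this sketch reproduces the mirrored behavior on the negative axis even though all training points lie on the positive axis, which is the kind of domain-knowledge effect the abstract refers to.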

Keywords
Gaussian processes, machine learning, uncertainty quantification
Mathematical Subject Classification
Primary: 60-08, 60G15
Milestones
Received: 12 November 2021
Revised: 8 March 2022
Accepted: 15 April 2022
Published: 7 October 2022
Authors
Marcus M. Noack
The Center for Advanced Mathematics for Energy Research Applications (CAMERA)
Lawrence Berkeley National Laboratory
Berkeley, CA
United States
James A. Sethian
The Center for Advanced Mathematics for Energy Research Applications (CAMERA)
Lawrence Berkeley National Laboratory
Berkeley, CA
United States
Department of Mathematics
University of California, Berkeley
Berkeley, CA
United States