2025-08-30
Omid Chatrabgoun

Academic rank: Associate Professor
Education: PhD.
Faculty: Mathematical Sciences and Statistics

Research

Title
Sparse Gaussian process regression via compactly supported kernels: A trade-off between accuracy and computational efficiency
Type
Journal Paper
Keywords
Compact support kernels, Gaussian process regression, Modified maximum likelihood estimation, Sparsification functional
Year
2025
Journal
Information Sciences
Researchers
Mohsen Esmaeilbeigi, Omid Chatrabgoun, Roberto Cavoretto, Alessandra De Rossi, Maryam Shafa

Abstract

Gaussian Process Regression (GPR) is a powerful kernel-based learning model, but its application to large datasets is hindered by the computational demands of handling a full kernel matrix. This paper introduces a novel approach to mitigate these challenges by employing Compactly Supported Radial Kernels (CSRKs) to create a sparse kernel matrix. CSRKs help to improve the conditioning of the kernel matrix during GPR training and prediction, thereby reducing both computational costs and memory usage. One key aspect of using CSRKs is the ability to control the sparsity of the kernel matrix by adjusting the kernel’s support. However, a significant challenge arises due to a trade-off between maintaining sparsity and achieving high prediction accuracy. Our work explores this trade-off in detail, demonstrating that the advantage of CSRKs in inducing sparsity can be diminished if accuracy is prioritized. To address this, we propose the integration of expert knowledge through a suitable sparsification functional (SF) during GPR training, coupled with Maximum Likelihood Estimation (MLE). This approach aims to strike a balance between accuracy and sparsity, leading to more efficient learning. We validate our method through extensive numerical simulations on both 2D regular and irregular domains with noisy datasets. Additionally, we conduct a comparative study on real data to assess the impact of our modified MLE on the predictive performance of GPR models. The results consistently show that our proposed method yields sparser, better-conditioned kernel matrices compared to traditional techniques.
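To make the core idea concrete, the following is a minimal illustrative sketch (not the paper's implementation) of GPR with a compactly supported radial kernel. It uses the standard Wendland C2 kernel as an example of a CSRK: pairs of points farther apart than the support radius contribute exact zeros to the kernel matrix, so shrinking the support increases sparsity, while the posterior mean is computed in the usual GPR way. All function names, parameter values, and the test function are our own choices for illustration; the paper's sparsification functional and modified MLE are not reproduced here.

```python
import numpy as np

def wendland_c2(r, support):
    """Wendland C2 compactly supported RBF: phi(s) = (1 - s)_+^4 (4s + 1), s = r/support.

    phi(0) = 1, and phi vanishes identically for r >= support, which is what
    produces exact zeros (hence sparsity) in the kernel matrix.
    """
    s = r / support
    return np.where(s < 1.0, (1.0 - s) ** 4 * (4.0 * s + 1.0), 0.0)

def kernel_matrix(X, Y, support):
    """Dense pairwise kernel matrix; entries are exactly zero beyond the support."""
    d = np.linalg.norm(X[:, None, :] - Y[None, :, :], axis=-1)
    return wendland_c2(d, support)

rng = np.random.default_rng(0)

# Noisy training data on a 2D regular-domain toy problem (illustrative only).
X = rng.uniform(0.0, 1.0, size=(200, 2))
y = (np.sin(2 * np.pi * X[:, 0]) * np.cos(2 * np.pi * X[:, 1])
     + 0.05 * rng.standard_normal(200))

support = 0.3   # smaller support -> sparser, better-conditioned kernel matrix,
noise = 1e-2    # but potentially lower accuracy: the trade-off the paper studies

K = kernel_matrix(X, X, support)
sparsity = np.mean(K == 0.0)  # fraction of exactly-zero entries

# Standard GPR posterior mean at new points: m(X*) = K(X*, X) (K + sigma^2 I)^{-1} y
Xs = rng.uniform(0.0, 1.0, size=(50, 2))
Ks = kernel_matrix(Xs, X, support)
alpha = np.linalg.solve(K + noise * np.eye(len(X)), y)
mean = Ks @ alpha
```

In a practical sparse-GPR pipeline the zero pattern of `K` would be exploited with a sparse matrix format and solver rather than the dense `np.linalg.solve` shown here; the sketch only demonstrates how the compact support controls the sparsity of the kernel matrix.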