In the machine learning task of “regression”, we would like to “train” a model that explains the mapping from a given high-dimensional input space X to observations in an observation space Y. That is, the trained model is supposed to give predictions for unseen inputs in X: for a query input, we get a prediction that follows the hidden relation between X and Y.
One of my interests is “kernel ridge regression”, which is equivalent to regularized approximation in reproducing kernel Hilbert spaces. My specific interests are multi-fidelity learning, fast training methods, and applications in scientific computing.
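To illustrate the idea, here is a minimal sketch of kernel ridge regression with a Gaussian kernel on synthetic 1-D data. All names (`gaussian_kernel`, `krr_fit`, `krr_predict`) and parameter choices are illustrative assumptions for this sketch, not code from the publications below.

```python
import numpy as np

def gaussian_kernel(X, Y, length_scale=1.0):
    # Pairwise squared Euclidean distances between rows of X and Y,
    # turned into Gaussian (RBF) kernel values.
    d2 = (np.sum(X**2, axis=1)[:, None]
          + np.sum(Y**2, axis=1)[None, :]
          - 2.0 * X @ Y.T)
    return np.exp(-d2 / (2.0 * length_scale**2))

def krr_fit(X_train, y_train, lam=1e-4, length_scale=1.0):
    # Solve the regularized linear system (K + lam * n * I) alpha = y
    # for the dual coefficients alpha.
    n = X_train.shape[0]
    K = gaussian_kernel(X_train, X_train, length_scale)
    return np.linalg.solve(K + lam * n * np.eye(n), y_train)

def krr_predict(X_query, X_train, alpha, length_scale=1.0):
    # Prediction is a kernel expansion over the training inputs.
    return gaussian_kernel(X_query, X_train, length_scale) @ alpha

# Toy example: recover y = sin(x) from noisy samples (hypothetical data).
rng = np.random.default_rng(0)
X = rng.uniform(-3.0, 3.0, size=(200, 1))
y = np.sin(X[:, 0]) + 0.05 * rng.normal(size=200)

alpha = krr_fit(X, y, lam=1e-4, length_scale=0.5)
X_q = np.array([[0.0], [1.0]])
y_q = krr_predict(X_q, X, alpha, length_scale=0.5)
```

With enough training samples, the predictions at the query points stay close to the underlying function sin(x), even though the model only ever sees noisy observations.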
- M. Griebel, Ch. Rieger, P. Zaspel. Kernel-based stochastic collocation for the random two-phase Navier-Stokes equations. Accepted for publication in International Journal for Uncertainty Quantification, April 2019; also available as arXiv:1810.11270.
- P. Zaspel, B. Huang, H. Harbrecht, O. A. von Lilienfeld. Boosting quantum machine learning models with multi-level combination technique: Pople diagrams revisited. Journal of Chemical Theory and Computation, 15(3):1546-1559, 2019; also available as arXiv:1808.02799.
- P. Zaspel. Algorithmic patterns for H-matrices on many-core processors. Journal of Scientific Computing, Springer, 78(2):1174-1206, 2019; also available as Preprint 2017-12, Fachbereich Mathematik, Universität Basel, Switzerland, 2017 and as arXiv:1708.09707.
- P. Zaspel. Parallel RBF Kernel-Based Stochastic Collocation for Large-Scale Random PDEs. PhD Thesis, Institute for Numerical Simulation, University of Bonn, Germany, April 2015.