Contributors: Gheondea, Aurelian; Tilki, C.; Oates, Chris
Date accessioned: 2025-02-19
Date available: 2025-02-19
Date issued: 2024-12
ISSN: 1532-4435
URI: https://hdl.handle.net/11693/116422
Abstract: We prove some representer theorems for a localised version of the semisupervised, manifold regularised and multiview support vector machine learning problem introduced by H.Q. Minh, L. Bazzani, and V. Murino, Journal of Machine Learning Research, 17 (2016), pp. 1-72, which involves operator valued positive semidefinite kernels and their reproducing kernel Hilbert spaces. The results cover general cases in which the loss functions may be convex or nonconvex and the underlying Hilbert spaces finite or infinite dimensional. We show that the general framework admits infinite dimensional Hilbert spaces and nonconvex loss functions in some special cases, in particular when the loss functions are Gateaux differentiable. Detailed calculations are provided for the exponential least squares loss functions, which lead to systems of partially nonlinear equations to which Newton-type approximation methods based on the interior point method can be applied. Numerical experiments on a toy model illustrate the tractability of the proposed methods.
Language: English
License: CC BY 4.0 (Attribution 4.0 International Deed), https://creativecommons.org/licenses/by/4.0/
Keywords: Operator valued reproducing kernel Hilbert spaces; Manifold co-regularised and multiview learning; Support vector machine learning; Loss functions; Representer theorem
Title: Localisation of regularised and multiview support vector machine learning
Type: Article
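The abstract notes that the exponential least squares loss leads to systems of partially nonlinear equations that can be solved with Newton-type methods. As a purely illustrative sketch, and not the authors' actual system or solver, the snippet below applies a generic Newton iteration to a small partially nonlinear system; the names newton_system, F, and J are hypothetical and chosen only for this example.

import numpy as np

def newton_system(F, J, x0, tol=1e-10, max_iter=50):
    """Generic Newton iteration for F(x) = 0 with Jacobian J(x).
    Illustrative only; no relation to the paper's specific equations."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        Fx = F(x)
        if np.linalg.norm(Fx) < tol:
            break
        # Newton step: solve J(x) dx = F(x), then update x <- x - dx.
        x = x - np.linalg.solve(J(x), Fx)
    return x

# Toy partially nonlinear system: one linear and one nonlinear equation.
F = lambda x: np.array([x[0] + 2.0 * x[1] - 2.0,
                        np.exp(x[0]) + x[1] - 2.0])
J = lambda x: np.array([[1.0, 2.0],
                        [np.exp(x[0]), 1.0]])

print(newton_system(F, J, x0=[0.0, 0.0]))  # converges to (0, 1)

In practice the systems described in the abstract would involve kernel expansions of the candidate solutions and, as stated there, interior point safeguards on top of the Newton step; this sketch only shows the basic iteration.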