Title: lp-norm constrained one-class classifier combination
Authors: Nourmohammadi, Sepehr; Rahimzadeh Arashloo, Shervin; Kittler, Josef
Type: Article
Date issued: 2024-10-16
Date accessioned: 2025-02-21
Date available: 2025-02-21
ISSN: 1566-2535 (print); 1872-6305 (electronic)
DOI: 10.1016/j.inffus.2024.102700
Handle: https://hdl.handle.net/11693/116557
Language: English
Keywords: Classifier combination; One-class classification; Sparsity modelling; lp-norm constraint; Convex optimisation

Abstract: Classifier fusion is established as an effective methodology for boosting performance in different classification settings, and one-class classification is no exception. In this study, we consider the one-class classifier fusion problem by modelling the sparsity/uniformity of the ensemble. To this end, we formulate a convex objective function to learn the weights in a linear ensemble model and impose a variable lp-norm (p >= 1) constraint on the weight vector. The vector-norm constraint enables the model to adapt to the intrinsic uniformity/sparsity of the ensemble in the space of base learners and acts as a (soft) classifier selection mechanism by shaping the relative magnitudes of fusion weights. Drawing on the Frank-Wolfe algorithm, we then present an effective approach to solve the proposed convex constrained optimisation problem efficiently. We evaluate the proposed one-class classifier combination approach on multiple data sets from diverse application domains and illustrate its merits in comparison to the existing approaches.
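The abstract describes minimising a convex objective over fusion weights subject to an lp-norm (p >= 1) constraint via the Frank-Wolfe algorithm. The sketch below is only an illustration of that general technique, not the authors' implementation: it uses a hypothetical quadratic objective in place of the paper's fusion loss, and a unit-radius lp ball. The linear minimisation oracle for p > 1 follows from Hoelder's inequality; for p = 1 it returns a signed scaled basis vector.

```python
import numpy as np

def lmo_lp_ball(grad, p, radius=1.0):
    """Linear minimisation oracle: argmin <s, grad> over {s : ||s||_p <= radius}.

    p = 1 selects a single coordinate (extreme sparsity); p > 1 spreads the
    solution according to the dual-norm exponent q with 1/p + 1/q = 1.
    """
    if p == 1.0:
        i = np.argmax(np.abs(grad))
        s = np.zeros_like(grad)
        s[i] = -radius * np.sign(grad[i])
        return s
    q = p / (p - 1.0)  # dual exponent
    denom = np.linalg.norm(grad, ord=q) ** (q - 1.0)
    if denom == 0.0:  # zero gradient: any feasible point is optimal
        return np.zeros_like(grad)
    return -radius * np.sign(grad) * np.abs(grad) ** (q - 1.0) / denom

def frank_wolfe(grad_f, p, dim, iters=1000, radius=1.0):
    """Projection-free Frank-Wolfe with the standard 2/(k+2) step size."""
    w = np.zeros(dim)
    for k in range(iters):
        s = lmo_lp_ball(grad_f(w), p, radius)
        gamma = 2.0 / (k + 2.0)
        w = (1.0 - gamma) * w + gamma * s  # convex combination stays feasible
    return w

# Toy usage (hypothetical objective): minimise ||w - c||^2 over the l2 ball.
c = np.array([0.8, 0.1, -0.3])
w = frank_wolfe(lambda w: 2.0 * (w - c), p=2.0, dim=3, iters=1000)
```

Because every iterate is a convex combination of points in the lp ball, the constraint holds at every step without a projection, which is the practical appeal of Frank-Wolfe for variable-norm constraints such as the one the abstract describes.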