Title: ℓp-norm support vector data description
Author: Arashloo, Shervin Rahimzadeh
Type: Article
Date issued: 2022-07-23
Date available: 2023-02-15
Language: English
Journal ISSN: 0031-3203
Journal eISSN: 1873-5142
DOI: 10.1016/j.patcog.2022.108930
Handle: http://hdl.handle.net/11693/111326
Keywords: One-class classification; Kernel methods; Support vector data description; ℓp-norm penalty

Abstract: The support vector data description (SVDD) approach serves as a de facto standard for one-class classification, where the learning task entails inferring the smallest hyper-sphere that encloses the target objects while linearly penalising the errors/slacks via an ℓ1-norm penalty term. In this study, we generalise this modelling formalism to a general ℓp-norm (p ≥ 1) penalty function on the slacks. By virtue of an ℓp-norm function, in the primal space, the proposed approach enables the formulation of a non-linear cost on the slacks. From a dual-problem perspective, the proposed method introduces a dual norm into the objective function, thus providing a controlling mechanism to tune into the intrinsic sparsity/uniformity of the problem for enhanced descriptive capability. A theoretical analysis based on Rademacher complexities characterises the generalisation performance of the proposed approach, while experimental results on several datasets confirm the merits of the proposed method compared to other alternatives.
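
To make the modelling step in the abstract concrete, the following is a minimal LaTeX sketch of the primal problems it describes, written in the standard SVDD notation of Tax and Duin (radius R, centre a, feature map φ, slacks ξ_i, trade-off parameter C); this notation and the exact form of the ℓp penalty are assumptions for illustration and may differ from the formulation in the paper itself.

% Standard SVDD primal with a linear (l1-norm) slack penalty:
\begin{align}
\min_{R,\,\mathbf{a},\,\boldsymbol{\xi}} \quad & R^2 + C \sum_{i=1}^{n} \xi_i \\
\text{s.t.} \quad & \lVert \phi(\mathbf{x}_i) - \mathbf{a} \rVert^2 \le R^2 + \xi_i, \quad \xi_i \ge 0 .
\end{align}
% One plausible reading of the lp-norm generalisation (p >= 1) described in the
% abstract: the linear slack term is replaced by an lp-norm penalty on the slack
% vector, giving a non-linear cost on the slacks in the primal:
\begin{align}
\min_{R,\,\mathbf{a},\,\boldsymbol{\xi}} \quad & R^2 + C\,\lVert \boldsymbol{\xi} \rVert_p^{p}
  \;=\; R^2 + C \sum_{i=1}^{n} \xi_i^{\,p} \\
\text{s.t.} \quad & \lVert \phi(\mathbf{x}_i) - \mathbf{a} \rVert^2 \le R^2 + \xi_i, \quad \xi_i \ge 0 .
\end{align}
% In the dual, such a penalty is conjugate to the dual norm with exponent q,
% where 1/p + 1/q = 1, which is the controlling mechanism the abstract refers to:
% p close to 1 favours sparse slack patterns, larger p spreads the penalty more
% uniformly across objects.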