Author: Rahimzadeh Arashloo, Shervin
Date accessioned: 2025-02-23
Date available: 2025-02-23
Date issued: 2023-12-09
ISSN: 0031-3203
Handle: https://hdl.handle.net/11693/116666

Abstract: Using a variable ℓp≥1-norm penalty on the slacks, the recently introduced ℓp-norm Support Vector Data Description (ℓp-SVDD) method has improved novelty-detection performance over the baseline approach, sometimes remarkably. This work extends this modelling formalism in multiple aspects. First, a large-margin extension of the ℓp-SVDD method is formulated to enhance generalisation capability by maximising the margin between the positive and negative samples. Second, based on the Frank–Wolfe algorithm, an efficient yet effective method with predictable accuracy is presented to optimise the convex objective function in the proposed method. Finally, it is illustrated that the proposed approach can effectively benefit from a multiple kernel learning scheme to achieve state-of-the-art performance. The proposed method is theoretically analysed using Rademacher complexities to link its classification error probability to the margin, and experimentally evaluated on several datasets to demonstrate its merits against existing methods.

Language: English
License: CC BY 4.0 (Attribution 4.0 International Deed), https://creativecommons.org/licenses/by/4.0/
Keywords: ℓp-SVDD; Large-margin learning; Convex optimisation; Frank–Wolfe algorithm; Multiple kernel learning; Imbalanced classification; Novelty detection
Title: Large-margin multiple kernel ℓp-SVDD using Frank–Wolfe algorithm for novelty detection
Type: Article
DOI: 10.1016/j.patcog.2023.110189
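The abstract refers to optimising a convex objective with the Frank–Wolfe algorithm. As a generic illustration only (not the paper's actual ℓp-SVDD formulation), the sketch below applies Frank–Wolfe to a simple quadratic objective over the probability simplex, the kind of constraint set that arises in SVDD-style dual problems; the function name, the quadratic objective, and the step-size rule are assumptions made for this example.

```python
import numpy as np

def frank_wolfe_simplex(K, n_iter=500):
    """Minimise f(a) = 0.5 * a^T K a over the probability simplex
    {a : a >= 0, sum(a) = 1} using the Frank-Wolfe algorithm.

    Each iteration needs only a gradient evaluation and a linear
    minimisation oracle (here: pick the coordinate with the smallest
    gradient entry), so iterates stay feasible without projection.
    This is an illustrative sketch, not the paper's method.
    """
    n = K.shape[0]
    a = np.full(n, 1.0 / n)            # feasible starting point
    gap = np.inf
    for t in range(n_iter):
        grad = K @ a                   # gradient of 0.5 * a^T K a
        i = int(np.argmin(grad))       # LMO over the simplex -> vertex e_i
        s = np.zeros(n)
        s[i] = 1.0
        gap = float(grad @ (a - s))    # duality gap, upper-bounds f(a) - f*
        if gap < 1e-8:
            break
        gamma = 2.0 / (t + 2.0)        # standard diminishing step size
        a = (1.0 - gamma) * a + gamma * s
    return a, gap
```

The duality gap computed at each step is what gives Frank–Wolfe the "predictable accuracy" the abstract alludes to: it is a certified upper bound on the suboptimality of the current iterate, so the loop can stop as soon as a target accuracy is reached.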