Browsing by Subject "Convex optimisation"
Now showing 1 - 2 of 2
- Results Per Page
- Sort Options
Item Embargo Large-margin multiple kernel ℓp-SVDD using Frank–Wolfe algorithm for novelty detection(Elsevier BV, 2023-12-09) Rahimzadeh Arashloo, ShervinUsing a variable 𝓁𝑝≥1-norm penalty on the slacks, the recently introduced 𝓁𝑝-norm Support Vector Data Description (𝓁𝑝-SVDD) method has improved the performance in novelty detection over the baseline approach, sometimes remarkably. This work extends this modelling formalism in multiple aspects. First, a large-margin extension of the 𝓁𝑝-SVDD method is formulated to enhance generalisation capability by maximising the margin between the positive and negative samples. Second, based on the Frank–Wolfe algorithm, an efficient yet effective method with predictable accuracy is presented to optimise the convex objective function in the proposed method. Finally, it is illustrated that the proposed approach can effectively benefit from a multiple kernel learning scheme to achieve state-of-the-art performance. The proposed method is theoretically analysed using Rademacher complexities to link its classification error probability to the margin and experimentally evaluated on several datasets to demonstrate its merits against existing methods.Item Embargo lp-norm constrained one-class classifier combination(Elsevier BV, 2024-10-16) Nourmohammadi, Sepehr; Rahimzadeh Arashloo, Shervin; Kittler, JosefClassifier fusion is established as an effective methodology for boosting performance in different classification settings and one-class classification is no exception. In this study, we consider the one-class classifier fusion problem by modelling the sparsity/uniformity of the ensemble. To this end, we formulate a convex objective function to learn the weights in a linear ensemble model and impose a variable l(p >= 1)-norm constraint on the weight vector. The vector-norm constraint enables the model to adapt to the intrinsic uniformity/sparsity of the ensemble in the space of base learners and acts as a (soft) classifier selection mechanism by shaping the relative magnitudes of fusion weights. Drawing on the Frank-Wolfe algorithm, we then present an effective approach to solve the proposed convex constrained optimisation problem efficiently. We evaluate the proposed one-class classifier combination approach on multiple data sets from diverse application domains and illustrate its merits in comparison to the existing approaches.