Abstract:
One-class classification (OCC) is a supervised classification problem in which the training data consist of only one class. Hyperparameter tuning is infeasible in OCC because evaluation requires access to other classes; if the model is updated after accessing other classes, the algorithm is no longer OCC. To address this issue, this paper proposes hyperparameter fusion, which requires no such evaluation. The fusion process applies ensemble learning techniques, voting and stacking, to OCC models trained with different hyperparameters. The experiments involve 54 OCC problems from 27 imbalanced-learn datasets and 115 hyperparameter candidates. The experimental results show that hyperparameter fusion outperformed the average of the base learners in the area under the receiver operating characteristic curve (AUC). Moreover, removing the worst base learner can further improve the ensemble's AUC score. The discussion section shows that the worst base learner can be predicted from correlations among the normality rankings produced by the model outputs: the worst base learner exhibits relatively small ranking correlations with the ensemble model compared with the other base learners.
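To make the fusion idea concrete, the following is a minimal sketch, not the paper's implementation: it uses a hypothetical k-nearest-neighbor-distance one-class scorer (trained on a single class only) and fuses soft-voting scores over several candidate values of the hyperparameter k via rank averaging. All names and hyperparameter choices here are illustrative assumptions.

```python
import math

def knn_score(train, x, k):
    # Illustrative OCC scorer (assumption, not the paper's models):
    # normality score = negative distance to the k-th nearest training point.
    dists = sorted(math.dist(x, t) for t in train)
    return -dists[min(k, len(dists)) - 1]

def rank_normalize(scores):
    # Map raw scores to ranks in [0, 1] so differently scaled
    # base learners become comparable before voting.
    order = sorted(range(len(scores)), key=lambda i: scores[i])
    ranks = [0.0] * len(scores)
    for r, i in enumerate(order):
        ranks[i] = r / max(len(scores) - 1, 1)
    return ranks

def fused_scores(train, test, ks):
    # Soft-voting fusion: average the rank-normalized normality scores
    # of base learners trained with different hyperparameters k.
    per_model = [rank_normalize([knn_score(train, x, k) for x in test])
                 for k in ks]
    return [sum(m[i] for m in per_model) / len(per_model)
            for i in range(len(test))]

# Toy demo: the training data are solely one class (points near the origin).
train = [(0.0, 0.0), (0.1, 0.1), (-0.1, 0.05), (0.05, -0.1), (0.0, 0.1)]
test = [(0.02, 0.0), (3.0, 3.0)]  # one inlier, one clear outlier
scores = fused_scores(train, test, ks=[1, 2, 3])
print(scores)  # the inlier receives a higher fused normality score
```

No hyperparameter selection step appears anywhere above: every candidate k contributes to the vote, which is the point of fusing rather than tuning.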