Title:      Classifier Rank - A New Classification Assessment Method
Author(s):      Ningsheng Zhao, Jia Yuan Yu and Krzysztof Dzieciolowski
ISBN:      978-989-8704-42-9
Editors:      Yingcai Xiao, Ajith Abraham, Guo Chao Peng and Jörg Roth
Year:      2022
Edition:      Single
Keywords:      Confusion Matrix, Class Imbalance, Performance Metrics, Graphical Inference, Classifier Rank
Type:      Short Paper
First Page:      233
Last Page:      237
Language:      English
Paper Abstract:      Most commonly used confusion-matrix-based classification performance metrics, such as the F1 score, MCC, and PRC, are sensitive to class imbalance. To address this problem, we propose a novel classifier evaluation method, called classifier rank, which provides the rank of a classifier in the space of all possible classifiers. To rank a classifier, we find the distribution of performance metrics conditional on an arbitrary class ratio. However, some metrics, like the PRC, are functions of a large sequence of confusion matrices whose joint distribution is difficult to estimate. Hence, we propose a directed binary tree model to represent this large-scale joint distribution efficiently. As a result, we can estimate the classifier rank using graphical inference algorithms, such as the Monte Carlo algorithm.
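To illustrate the rank idea described in the abstract, the sketch below is a minimal Monte Carlo estimate of a classifier's rank under the F1 metric at a fixed class ratio. It is only an illustration of the general concept, not the paper's directed binary tree model: it assumes that uniform draws over (TPR, FPR) stand in for the space of all possible classifiers, and the function name `classifier_rank_f1` is hypothetical.

```python
import numpy as np

def classifier_rank_f1(observed_f1: float, pos_ratio: float,
                       n_samples: int = 100_000, seed: int = 0) -> float:
    """Monte Carlo estimate of a classifier's rank under the F1 metric.

    Draws random classifiers uniformly over (TPR, FPR) -- an assumed
    stand-in for "all possible classifiers" -- computes each one's F1
    at the given positive-class ratio, and returns the fraction whose
    F1 falls below the observed score.
    """
    rng = np.random.default_rng(seed)
    tpr = rng.uniform(size=n_samples)      # true positive rates
    fpr = rng.uniform(size=n_samples)      # false positive rates
    tp = tpr * pos_ratio                   # expected confusion-matrix
    fp = fpr * (1.0 - pos_ratio)           # cell proportions at this
    fn = (1.0 - tpr) * pos_ratio           # class ratio
    f1 = 2 * tp / (2 * tp + fp + fn)       # F1 = 2TP / (2TP + FP + FN)
    return float((f1 < observed_f1).mean())

# Example: rank of a classifier scoring F1 = 0.5 on a 1:9 imbalanced task.
print(classifier_rank_f1(0.5, pos_ratio=0.1))
```

Because the class ratio is an explicit input, the same observed score can be re-ranked under any imbalance level, which is the property the abstract highlights; ranking under the PRC would require the joint distribution of many confusion matrices, which is where the paper's directed binary tree model comes in.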
   
