Confusion Matrix and its 25 offspring: or the link between machine learning and epidemiology | Dr. Yury Zablotski
Importance of Matthews Correlation Coefficient & Cohen's Kappa for Imbalanced Classes | by Sarit Maitra | Medium
Cohen's Kappa: What it is, when to use it, and how to avoid its pitfalls | by Rosaria Silipo | Towards Data Science
Multi-Class Metrics Made Simple, Part III: the Kappa Score (aka Cohen's Kappa Coefficient) | by Boaz Shmueli | Towards Data Science
Cohen's Kappa: what it is, when to use it, how to avoid pitfalls | KNIME
Top 15 Evaluation Metrics for Machine Learning with Examples
Confusion Matrix - an overview | ScienceDirect Topics
Metrics to evaluate classification models with R codes: Confusion Matrix, Sensitivity, Specificity, Cohen's Kappa Value, McNemar's Test - Data Science Vidhya
The accuracy and Cohen's kappa of the confusion matrix example for the... | Download Table
What is Kappa in a confusion matrix? - Quora
Rizal Fathony on Twitter: "6/ Our framework supports a wide variety of non-decomposable performance metrics that can be expressed as a sum of fractions over the entities in the confusion matrix. This includes
Metrics for Multi-Class Classification: an Overview – arXiv Vanity
Evaluation Metrics in Machine Learning Models using Python | by Manoj Singh | Analytics Vidhya | Medium
24 Evaluation Metrics for Binary Classification (And When to Use Them) - neptune.ai
Accuracy Metrics
Why Cohen's Kappa should be avoided as performance measure in classification
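The metrics these links revolve around (accuracy, Cohen's kappa, the Matthews correlation coefficient) can all be computed directly from the cells of a confusion matrix. A minimal sketch in plain Python, using a made-up 2x2 matrix for illustration (the counts are hypothetical, not taken from any of the articles above):

```python
# Hypothetical binary confusion matrix, rows = actual, cols = predicted:
#   [[TN, FP],
#    [FN, TP]]
tn, fp, fn, tp = 50, 10, 5, 35
n = tn + fp + fn + tp  # total number of samples

# Accuracy: fraction of correct predictions (observed agreement p_o)
accuracy = (tp + tn) / n

# Cohen's kappa: observed agreement corrected for chance agreement p_e
p_pos = ((tp + fp) / n) * ((tp + fn) / n)  # chance both label "positive"
p_neg = ((tn + fn) / n) * ((tn + fp) / n)  # chance both label "negative"
p_e = p_pos + p_neg
kappa = (accuracy - p_e) / (1 - p_e)

# Matthews correlation coefficient: a correlation between actual
# and predicted labels, robust to class imbalance
mcc = (tp * tn - fp * fn) / (
    (tp + fp) * (tp + fn) * (tn + fp) * (tn + fn)
) ** 0.5

print(f"accuracy={accuracy:.3f} kappa={kappa:.3f} mcc={mcc:.3f}")
```

With these counts, accuracy (0.85) looks noticeably better than kappa (~0.69) or MCC (~0.70), which is exactly the gap the imbalanced-classes articles in this list focus on. In practice `sklearn.metrics.cohen_kappa_score` and `sklearn.metrics.matthews_corrcoef` compute the same quantities from raw label arrays.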