kappa in confusion matrix
Cohen's Kappa: What It Is, When to Use It, and How to Avoid Its Pitfalls - The New Stack
Metrics to evaluate classification models with R codes: Confusion Matrix, Sensitivity, Specificity, Cohen's Kappa Value, McNemar's Test - Data Science Vidhya
Week 6: Diagnostic Metrics: Kappa and Accuracy - YouTube
regression - How to calculate information included in R's confusion matrix - Cross Validated
Accuracy Metrics
GitHub - habernal/confusion-matrix: Minimalistic Java implementation of a confusion matrix for evaluating learning algorithms, including accuracy, macro F-measure, Cohen's Kappa, and probabilistic confusion matrix
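The links above all revolve around computing Cohen's kappa from a confusion matrix. As a minimal, dependency-free sketch of that calculation (function name and the example matrix are illustrative, not taken from any of the linked sources): observed agreement p_o is the matrix trace over the total count, chance agreement p_e comes from the row and column marginals, and kappa = (p_o - p_e) / (1 - p_e).

```python
def cohens_kappa(cm):
    """Cohen's kappa from a square confusion matrix given as a list of lists.

    kappa = (p_o - p_e) / (1 - p_e), where p_o is observed agreement
    (trace / total) and p_e is expected chance agreement derived from
    the row and column marginals.
    """
    k = len(cm)                                   # number of classes
    n = sum(sum(row) for row in cm)               # total observations
    p_o = sum(cm[i][i] for i in range(k)) / n     # observed agreement
    row_marg = [sum(cm[i]) for i in range(k)]     # per-class row totals
    col_marg = [sum(cm[i][j] for i in range(k)) for j in range(k)]
    p_e = sum(row_marg[i] * col_marg[i] for i in range(k)) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical 2x2 example: 35 agreements out of 50 observations.
cm = [[20, 5],
      [10, 15]]
print(cohens_kappa(cm))  # → 0.4
```

For this example, accuracy is 0.7 but chance agreement is 0.5, so kappa lands at 0.4, illustrating the "pitfall" the New Stack article's title alludes to: a seemingly decent accuracy can correspond to a much more modest chance-corrected agreement.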