confusion matrix, accuracy, precision, recall, f1-score
- confusion matrix
- fbeta score (standard definitions are sketched just below)
- example: a 3-class classifier over Cat, Dog, and Fish; per-class metrics are worked out below, followed by a short verification sketch
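For reference, these are the standard textbook definitions used by the example below (a general sketch, not specific to this repo; β is the usual F-beta parameter, and β = 1 gives the F1 score):

```math
\text{precision} = \frac{TP}{TP + FP}, \qquad
\text{recall} = \frac{TP}{TP + FN}, \qquad
F_\beta = \frac{(1+\beta^2)\cdot\text{precision}\cdot\text{recall}}{\beta^2\cdot\text{precision} + \text{recall}}
```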
- Cat
- TP:10, FN:15, FP:13, TN:43
- precision: TP / (TP + FP) = 10 / (10 + 13) ≈ 0.435
- recall: TP / (TP + FN) = 10 / (10 + 15) = 0.4
- f1-score: 2*(precision * recall) / (precision + recall) ≈ 0.417
- Dog
- TP:20, FN:10, FP:12, TN:38
- precision: TP / (TP + FP) = 20 / (20 + 12) = 0.625
- recall: TP / (TP + FN) = 20 / (20 + 10) ≈ 0.667
- f1-score: 2*(precision * recall) / (precision + recall) ≈ 0.645
- Fish
- TP:13, FN:12, FP:13, TN:43
- precision: TP / (TP + FP) = 13 / (13 + 13) = 0.5
- recall: TP / (TP + FN) = 13 / (13 + 12) = 0.52
- f1-score: 2*(precision * recall) / (precision + recall) ≈ 0.510
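A minimal verification sketch (plain Python, not part of the original wiki) that recomputes the per-class metrics above from the stated TP/FP/FN counts:

```python
# Recompute per-class precision, recall, and F1 from the example's counts.
counts = {
    "Cat":  {"TP": 10, "FP": 13, "FN": 15},
    "Dog":  {"TP": 20, "FP": 12, "FN": 10},
    "Fish": {"TP": 13, "FP": 13, "FN": 12},
}

for name, c in counts.items():
    precision = c["TP"] / (c["TP"] + c["FP"])
    recall = c["TP"] / (c["TP"] + c["FN"])
    f1 = 2 * precision * recall / (precision + recall)
    print(f"{name}: precision={precision:.3f} recall={recall:.3f} f1={f1:.3f}")
# Cat: precision=0.435 recall=0.400 f1=0.417
# Dog: precision=0.625 recall=0.667 f1=0.645
# Fish: precision=0.500 recall=0.520 f1=0.510
```

Note that the TN counts are not needed for precision, recall, or F1.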
- macro average f1-score
- conceptually an average of averages
- gives every class the same weight, so it evaluates overall performance across classes
- the equal weight is applied per class, not per sample
- sum(each class's f1-score) / number of classes = (0.417 + 0.645 + 0.510) / 3 ≈ 0.524 (see the sketch below)
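A minimal sketch of the macro average under the same example counts, recomputing each class's F1 exactly rather than from the rounded values above:

```python
# Macro-averaged F1: unweighted mean of the per-class F1 scores.
counts = {
    "Cat":  (10, 13, 15),   # (TP, FP, FN)
    "Dog":  (20, 12, 10),
    "Fish": (13, 13, 12),
}

def f1(tp, fp, fn):
    # F1 = 2*TP / (2*TP + FP + FN), equivalent to 2PR / (P + R)
    return 2 * tp / (2 * tp + fp + fn)

macro_f1 = sum(f1(*c) for c in counts.values()) / len(counts)
print(f"macro F1 = {macro_f1:.3f}")   # ≈ 0.524
```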
- micro average f1-score
- conceptually a single average over all samples pooled together
- appropriate when every sample should carry the same weight
- sum TP, FN, and FP over all classes, then compute precision, recall, and f1-score once from the pooled counts (see the sketch below)
- precision = TPs / (TPs + FPs) = (10+20+13) / ((10+20+13) + (13+12+13)) = 43 / 81 ≈ 0.531
- recall = TPs / (TPs + FNs) = (10+20+13) / ((10+20+13) + (15+10+12)) = 43 / 80 = 0.5375
- f1-score = 2*(precision * recall) / (precision + recall) ≈ 0.534
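A minimal sketch of the micro average, pooling the example's counts before applying the formulas:

```python
# Micro-averaged metrics: pool TP/FP/FN across Cat, Dog, and Fish,
# then apply the usual formulas once to the pooled counts.
TP = 10 + 20 + 13   # 43
FP = 13 + 12 + 13   # 38
FN = 15 + 10 + 12   # 37

precision = TP / (TP + FP)                           # 43/81 ≈ 0.531
recall = TP / (TP + FN)                              # 43/80 = 0.5375
f1 = 2 * precision * recall / (precision + recall)   # 86/161 ≈ 0.534
print(f"micro: precision={precision:.3f} recall={recall:.3f} f1={f1:.3f}")
```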
- weighted macro-average
- weights each class's contribution by its number of actual instances (its support)
- useful when dealing with class imbalance (a worked value for this example is sketched below)
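The wiki gives no number for the weighted macro-average; a minimal sketch, assuming the same example counts and using support = TP + FN per class, would be:

```python
# Support-weighted ("weighted") average F1 for the example above.
counts = {
    "Cat":  (10, 13, 15),   # (TP, FP, FN)
    "Dog":  (20, 12, 10),
    "Fish": (13, 13, 12),
}

def f1(tp, fp, fn):
    return 2 * tp / (2 * tp + fp + fn)

# Support = number of actual instances of each class = TP + FN.
support = {name: tp + fn for name, (tp, _fp, fn) in counts.items()}   # 25, 30, 25
total = sum(support.values())                                         # 80
weighted_f1 = sum(support[n] * f1(*counts[n]) for n in counts) / total
print(f"weighted F1 = {weighted_f1:.3f}")   # ≈ 0.531
```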
References