ConfusionMatrix - smart1004/doc GitHub Wiki
Convert scikit-learn confusion matrix to pandas DataFrame: https://gist.github.com/nickynicolson/202fe765c99af49acb20ea9f77b6255e
@ Pretty print for sklearn confusion matrix
https://gist.github.com/zachguo/10296432
=====
import pandas as pd
from sklearn import metrics

def confusion_matrix(y_true=None, y_pred=None, labels=None):
    '''DataFrame of a confusion matrix. Rows are actual, columns are predicted.

    Parameters
    ----------
    y_true : array
    y_pred : array
    labels : list-like
        Class labels, in the order used for both axes.

    Returns
    -------
    confusion_matrix : DataFrame
    '''
    # Pass labels through to metrics.confusion_matrix so the DataFrame
    # index/columns line up with the matrix rows/columns.
    df = (pd.DataFrame(metrics.confusion_matrix(y_true, y_pred, labels=labels),
                       index=labels, columns=labels)
            .rename_axis("actual")
            .rename_axis("predicted", axis=1))
    return df
===========
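As a quick usage sketch, here is the same DataFrame construction applied inline to toy data (the labels and values are made up for illustration):

```python
import pandas as pd
from sklearn import metrics

y_true = ["cat", "dog", "dog", "cat", "bird"]
y_pred = ["cat", "dog", "cat", "cat", "bird"]
labels = ["bird", "cat", "dog"]

# Rows are actual classes, columns are predicted classes.
df = (pd.DataFrame(metrics.confusion_matrix(y_true, y_pred, labels=labels),
                   index=labels, columns=labels)
        .rename_axis("actual")
        .rename_axis("predicted", axis=1))
print(df)
# The "dog" row shows one dog misclassified as "cat".
```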
Multi-class classification
https://docs.aws.amazon.com/ko_kr/machine-learning/latest/dg/multiclass-classification.html
Explains the three-class case: http://nate9389.tistory.com/1138
2.3. Confusion matrix
  2.3.1. The entry 7 is the number of documents that should be classified as C1 and were predicted as C1
  2.3.2. Accuracy := (7+8+9) ÷ (1+2+3+4+5+6+7+8+9) = 24/45
2.4. Accuracy metrics
  2.4.1. True Positive: #TP = 7+8+9 = 24
  2.4.2. False Positive: #FP = 1+2+3+4+5+6 = 21
  2.4.3. True Negative #TN: a negative example correctly predicted to be a different class
  2.4.4. False Negative #FN: a negative example incorrectly predicted to be the same class
  2.4.5. Accuracy = (#TP + #TN) ÷ (#TP + #FP + #TN + #FN)
  2.4.6. Sensitivity = #TP ÷ (#TP + #FN)
  2.4.7. Specificity = #TN ÷ (#TN + #FP)
  2.4.8. Precision = #TP ÷ (#TP + #FP)
  2.4.9. Recall = #TP ÷ (#TP + #FN)
  2.4.10. F1 = 2 × Precision × Recall ÷ (Precision + Recall) = #TP ÷ [#TP + (#FN + #FP)/2]
2.5. Receiver operating characteristic (ROC) curve
2.6. Imbalanced datasets
2.7. Precision and recall
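The per-class metrics above can be computed directly from the matrix with NumPy. The 3×3 matrix below is one possible arrangement consistent with the worked example's totals (diagonal 7, 8, 9; off-diagonal entries 1–6), chosen only for illustration:

```python
import numpy as np

# Rows = actual, columns = predicted. Diagonal is 7, 8, 9 and the
# off-diagonal entries are 1..6, matching the totals in the example above.
M = np.array([[7, 1, 2],
              [3, 8, 4],
              [5, 6, 9]])

tp = np.diag(M)              # true positives per class
fp = M.sum(axis=0) - tp      # predicted as the class but actually another
fn = M.sum(axis=1) - tp      # actually the class but predicted as another
tn = M.sum() - tp - fp - fn  # everything else

accuracy    = tp.sum() / M.sum()   # (7+8+9) / 45 = 24/45
precision   = tp / (tp + fp)       # per class
recall      = tp / (tp + fn)       # = sensitivity, per class
specificity = tn / (tn + fp)
f1          = 2 * precision * recall / (precision + recall)
```

Note that `tp.sum()` and `fp.sum()` reproduce the #TP = 24 and #FP = 21 totals from the outline.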
Class-specific recall
https://www.researchgate.net/figure/Population-cumulative-confusion-matrix-showing-classification-accuracy-for-the-best_fig4_228812391
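In scikit-learn, class-specific recall falls out of `recall_score` with `average=None`, which returns one value per class instead of a single aggregate (the toy labels below are made up for illustration):

```python
from sklearn.metrics import recall_score

# Three classes; recall is computed separately for each one.
y_true = [0, 0, 1, 1, 1, 2, 2]
y_pred = [0, 1, 1, 1, 0, 2, 2]

per_class = recall_score(y_true, y_pred, average=None)
# class 0: 1 of 2 recovered, class 1: 2 of 3, class 2: 2 of 2
```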
Good Korean-language explanation: http://bcho.tistory.com/1206
Classification results table (Confusion Matrix)
https://datascienceschool.net/view-notebook/731e0d2ef52c41c686ba53dcaf346f32/
https://www.quora.com/How-can-I-create-a-confusion-matrix-in-MS-Excel
Use =COUNTIFS.
Example
Suppose predictions are in column B and ground-truth labels in column C.
Create a 3x3 table in E1:G3: the truth-positive label in E2, truth-negative in E3, prediction-positive in F1, and prediction-negative in G1.
The top-left cell of the matrix (F2) is =COUNTIFS($B:$B,1,$C:$C,1), where 1 stands for the positive label, so F2 counts the true positives. If your data uses TRUE or "Yes" instead, substitute that value for 1 in the formula. The cell below it (F3) is =COUNTIFS($B:$B,1,$C:$C,0), the false positives. The remaining two cells follow the same pattern.
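The same tally that the COUNTIFS formulas produce can be reproduced in pandas with `crosstab` (the column names and data below are illustrative):

```python
import pandas as pd

# Column B = predictions, column C = ground truth, encoded as
# 1 (positive) / 0 (negative) as in the Excel example.
df = pd.DataFrame({
    "prediction": [1, 1, 0, 0, 1, 0],
    "truth":      [1, 0, 0, 1, 1, 0],
})

# Rows = truth, columns = prediction -- the same layout as the E1:G3 table.
matrix = pd.crosstab(df["truth"], df["prediction"])
print(matrix)
```

Each cell of `matrix` is the count of rows matching that (truth, prediction) pair, exactly what each COUNTIFS formula counts.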