ConfusionMatrix - smart1004/doc GitHub Wiki

Convert scikit-learn confusion matrix to pandas DataFrame: https://gist.github.com/nickynicolson/202fe765c99af49acb20ea9f77b6255e

Pretty print for sklearn confusion matrix:
https://gist.github.com/zachguo/10296432
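A minimal sketch of such a pretty-printer (in the spirit of the linked gist, not its exact code; the gist itself offers more options): rows are true labels, columns are predicted labels, and every column is padded to a common width.

```python
def format_cm(cm, labels):
    """Return a confusion matrix as an aligned plain-text table.

    Rows are true labels, columns are predicted labels.
    A minimal sketch; the linked gist's print_cm is more featureful.
    """
    # Column width: widest label (at least 5 chars) plus padding.
    width = max(max(len(l) for l in labels), 5) + 2
    # Header row: blank corner cell, then one cell per predicted label.
    lines = ["".join([" " * width] + [f"{l:>{width}}" for l in labels])]
    # One row per true label.
    for label, row in zip(labels, cm):
        lines.append("".join([f"{label:>{width}}"] + [f"{n:>{width}}" for n in row]))
    return "\n".join(lines)


print(format_cm([[3, 1], [2, 4]], ["cat", "dog"]))
```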

=====

import pandas as pd
from sklearn import metrics

def confusion_matrix(y_true=None, y_pred=None, labels=None):
    '''DataFrame of confusion matrix. Rows are actual, and columns are predicted.

    Parameters
    ----------
    y_true : array
    y_pred : array
    labels : list-like

    Returns
    -------
    confusion_matrix : DataFrame
    '''
    df = (pd.DataFrame(metrics.confusion_matrix(y_true, y_pred),
                       index=labels, columns=labels)
            .rename_axis("actual")
            .rename_axis("predicted", axis=1))
    return df
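Usage of the helper above might look like this (the labels and the y_true/y_pred arrays are made-up example data; passing `labels` to sklearn's `confusion_matrix` fixes the row/column order):

```python
import pandas as pd
from sklearn import metrics

# Hypothetical example data.
y_true = ["cat", "dog", "cat", "bird", "dog", "cat"]
y_pred = ["cat", "dog", "bird", "bird", "cat", "cat"]
labels = ["bird", "cat", "dog"]

# Same construction as the helper above: labeled rows (actual)
# and columns (predicted) around sklearn's raw count matrix.
df = (pd.DataFrame(metrics.confusion_matrix(y_true, y_pred, labels=labels),
                   index=labels, columns=labels)
        .rename_axis("actual")
        .rename_axis("predicted", axis=1))
print(df)
```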

=========== Multiclass classification
https://docs.aws.amazon.com/ko_kr/machine-learning/latest/dg/multiclass-classification.html

https://docs.aws.amazon.com/ko_kr/machine-learning/latest/dg/machinelearning-dg.pdf#about-batch-predictions

3๊ฐœ์˜ ํด๋ž˜์Šค์— ๋Œ€ํ•ด ์„ค๋ช…์„ ํ•˜๊ณ  ์žˆ๋‹ค http://nate9389.tistory.com/1138

2.3. ํ˜ผ๋™ ํ–‰๋ ฌ (confusion matrix) 2.3.1. 7์€ C1์œผ๋กœ ๋ถ„๋ฅ˜๋ผ์•ผ ํ•  ๋ฌธ์ œ์—์„œ C1์œผ๋กœ ์˜ˆ์ธกํ•œ ํšŸ์ˆ˜๋ฅผ ๋‚˜ํƒ€๋ƒ„ 2.3.2. ์ •ํ™•๋„ (Accuracy) := (7+8+9) รท (1+2+3+4+5+6+7+8+9) = 24/45 2.4. ์ •ํ™•๋„ ์ง€ํ‘œ 2.4.1. ์ฐธ ๊ธ์ • (True Positive) #TP = 7+8+9 = 24 2.4.2. ๊ฑฐ์ง“ ๊ธ์ • (False Positive) #FP = 1+2+3+4+5+6 = 21 2.4.3. ์ฐธ ๋ถ€์ • (True Negative) #TN : ๋ถ€์ • ์˜ˆ์ œ์— ๋Œ€ํ•ด ์˜ฌ๋ฐ”๋ฅด๊ฒŒ ๋‹ค๋ฅธ ํด๋ž˜์Šค๋ผ๊ณ  ์˜ˆ์ƒํ•œ ๊ฒฝ์šฐ 2.4.4. ๊ฑฐ์ง“ ๋ถ€์ • (False Negative) #FN : ๋ถ€์ • ์˜ˆ์ œ์— ๋Œ€ํ•ด ๊ฐ™์€ ํด๋ž˜์Šค๋ผ๊ณ  ์ž˜๋ชป ์˜ˆ์ƒํ•œ ๊ฒฝ์šฐ 2.4.5. ์ •ํ™•๋„ (Accuracy) =(#TP + #TN) รท (#TP + #FP + #TN + #FN) 2.4.6. ๋ฏผ๊ฐ๋„ (Sensitivity) = #TP รท (#TP + #FN) 2.4.7. ํŠน์ด๋„ (Specificity) = #TN รท (#TN + #FP) 2.4.8. ์ •๋ฐ€๋„ (Precision) = #TP รท (#TP + #FP) 2.4.9. ์žฌํ˜„์œจ (Recall) = #TP รท (#TP + #FN) 2.4.10. F1 = 2 ร— ์ •๋ฐ€๋„ ร— ์žฌํ˜„์œจ รท (์ •๋ฐ€๋„ + ์žฌํ˜„์œจ) = #TP รท [ #TP + (#FN + #FP)/2 ] 2.5. ์ˆ˜์‹ ์ž ์กฐ์ž‘ ํŠน์„ฑ ๊ณก์„  2.6. ๋ถˆ๊ท ํ˜• ๋ฐ์ดํ„ฐ์„ธํŠธ 2.7. ์ •๋ฐ€๋„, ์ง„์‹ค๋„

Class-specific recall:
https://www.researchgate.net/figure/Population-cumulative-confusion-matrix-showing-classification-accuracy-for-the-best_fig4_228812391

Good Korean-language write-up: http://bcho.tistory.com/1206

Classification result table (confusion matrix):
https://datascienceschool.net/view-notebook/731e0d2ef52c41c686ba53dcaf346f32/

http://scikit-learn.org/stable/auto_examples/model_selection/plot_confusion_matrix.html#sphx-glr-auto-examples-model-selection-plot-confusion-matrix-py

https://www.quora.com/How-can-I-create-a-confusion-matrix-in-MS-Excel
Use =COUNTIFS. Example: predictions in column B and truths in column C. Create a 3×3 table in E1:G3, with the truth-positive label in E2, the truth-negative label in E3, the prediction-positive label in F1, and the prediction-negative label in G1. The top-left cell of the matrix (F2) is =COUNTIFS($B:$B,1,$C:$C,1), where 1 represents the positive class, so F2 counts the true positives. If your data encodes the positive class as TRUE or "Yes" instead, substitute that for 1 in the formula. The cell below it (F3) is =COUNTIFS($B:$B,1,$C:$C,0), and the remaining cells follow the same pattern.
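The same cell-by-cell tally that =COUNTIFS performs can be reproduced in Python with `pandas.crosstab` (the column names and data below are made-up examples):

```python
import pandas as pd

# Hypothetical data: "prediction" plays the role of Excel column B,
# "truth" the role of column C, with 1 = positive and 0 = negative.
df = pd.DataFrame({"prediction": [1, 1, 0, 1, 0, 0],
                   "truth":      [1, 0, 0, 1, 1, 0]})

# Each cell of the crosstab counts rows matching one (truth, prediction)
# pair, exactly what one =COUNTIFS formula computes per cell.
table = pd.crosstab(df["truth"], df["prediction"])
print(table)
```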