Metrics - niranjv/ml-notes GitHub Wiki
## Sensitivity / Recall / True Positive Rate
Probability that the classifier predicts 1 for a sample whose true class label is 1, i.e., how many of the +ve samples were identified as +ve: (predicted & true +ve) / (all true +ve).
## Specificity / True Negative Rate
Probability that the classifier predicts 0 for a sample whose true class label is 0, i.e., how many of the -ve samples were identified as -ve: (predicted & true -ve) / (all true -ve).
## False Positive Rate
= 1 - Specificity
= (predicted +ve & true -ve) / (all true -ve)
## Precision
Probability that a sample is truly positive given that the classifier has said it is positive, i.e., how many of the samples predicted as positive are actually positive: (predicted & true +ve) / (all predicted +ve).
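The four rates above can be computed directly from counts of true/false positives and negatives. A minimal sketch in plain Python; the labels and predictions are made-up illustrative data:

```python
# Hypothetical data: 1 = +ve, 0 = -ve
y_true = [1, 1, 1, 1, 0, 0, 0, 0, 0, 0]
y_pred = [1, 1, 1, 0, 1, 0, 0, 0, 0, 0]

tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)  # predicted & true +ve
fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)  # missed +ve
fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)  # predicted +ve but true -ve
tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)  # predicted & true -ve

sensitivity = tp / (tp + fn)  # predicted & true +ve / all true +ve
specificity = tn / (tn + fp)  # predicted & true -ve / all true -ve
fpr = fp / (fp + tn)          # = 1 - specificity
precision = tp / (tp + fp)    # true +ve / all predicted +ve

print(sensitivity, specificity, fpr, precision)
```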
## ROC curve
- X-axis = False Positive Rate (= 1 - Specificity)
- Y-axis = True Positive Rate
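An ROC curve is traced by sweeping a decision threshold over the classifier's scores and recording (FPR, TPR) at each threshold. A sketch with made-up scores:

```python
# Illustrative labels and classifier scores (not from any real model)
y_true = [1, 1, 1, 0, 1, 0, 0, 0]
scores = [0.9, 0.8, 0.7, 0.6, 0.55, 0.4, 0.3, 0.1]

def roc_points(y_true, scores):
    pos = sum(y_true)
    neg = len(y_true) - pos
    points = [(0.0, 0.0)]  # threshold above all scores: nothing predicted +ve
    for thr in sorted(set(scores), reverse=True):
        tp = sum(1 for t, s in zip(y_true, scores) if t == 1 and s >= thr)
        fp = sum(1 for t, s in zip(y_true, scores) if t == 0 and s >= thr)
        points.append((fp / neg, tp / pos))  # (FPR, TPR)
    return points

pts = roc_points(y_true, scores)
print(pts)  # runs from (0, 0) up to (1, 1)
```

Both coordinates are non-decreasing as the threshold drops, which is why the curve always runs from (0, 0) to (1, 1).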
## Precision-Recall curve
- X-axis = Recall
- Y-axis = Precision

Use when there are very few +ve samples compared to -ve samples (highly imbalanced classes), since precision, unlike FPR, is sensitive to the number of false positives even when true negatives dominate.
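Like the ROC curve, the precision-recall curve is traced by sweeping a threshold over the scores, recording recall and precision at each step (conventionally recall on the x-axis). A sketch with an imbalanced, made-up dataset (2 +ve out of 10):

```python
# Illustrative imbalanced data: only 2 of 10 samples are +ve
y_true = [1, 0, 1, 0, 0, 0, 0, 0, 0, 0]
scores = [0.95, 0.85, 0.7, 0.5, 0.4, 0.35, 0.3, 0.2, 0.15, 0.1]

def pr_points(y_true, scores):
    pos = sum(y_true)
    points = []
    for thr in sorted(set(scores), reverse=True):
        tp = sum(1 for t, s in zip(y_true, scores) if t == 1 and s >= thr)
        fp = sum(1 for t, s in zip(y_true, scores) if t == 0 and s >= thr)
        points.append((tp / pos, tp / (tp + fp)))  # (recall, precision)
    return points

pts = pr_points(y_true, scores)
print(pts)
```

At the lowest threshold everything is predicted +ve, so precision collapses to the +ve class prevalence (0.2 here), which is what makes this curve informative on imbalanced data.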