In R, you can calculate Precision, Recall (also known as Sensitivity), and the F1-Score using the results from a confusion matrix. This is commonly done in the context of classification tasks. Here's how you can do it:
First, you need a confusion matrix, which you can generate using the table() function or specific functions from packages like caret.
Assume you have the actual and predicted classifications:
actual <- factor(c('yes', 'no', 'no', 'yes', 'yes', 'no', 'yes', 'no', 'no', 'yes'))
predicted <- factor(c('yes', 'yes', 'no', 'no', 'yes', 'yes', 'yes', 'no', 'no', 'no'))

conf_matrix <- table(Predicted = predicted, Actual = actual)
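Before indexing into the table, it helps to print it and see how the rows (predicted class) and columns (actual class) line up. With the example vectors above, the counts work out as shown in the comments:

print(conf_matrix)
#          Actual
# Predicted no yes
#       no   3   2
#       yes  2   3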
Precision, Recall, and the F1-Score can then be calculated as follows:

Precision = TP / (TP + FP)
Recall = TP / (TP + FN)
F1 = 2 * (Precision * Recall) / (Precision + Recall)

where TP is True Positives, FP is False Positives, and FN is False Negatives.
# Factor levels default to alphabetical order ('no', 'yes'), so row/column 2
# corresponds to 'yes', which we treat as the positive class.
TP <- conf_matrix[2, 2]  # predicted 'yes', actual 'yes'
FP <- conf_matrix[2, 1]  # predicted 'yes', actual 'no'
FN <- conf_matrix[1, 2]  # predicted 'no', actual 'yes'

precision <- TP / (TP + FP)
recall <- TP / (TP + FN)
f1_score <- 2 * (precision * recall) / (precision + recall)
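If you need these numbers for more than one pair of vectors, the same arithmetic can be wrapped in a small helper. This is just a base-R sketch; the function name prf1 and its positive argument are illustrative, not from any package:

prf1 <- function(actual, predicted, positive = "yes") {
  # Count true positives, false positives, and false negatives directly
  tp <- sum(predicted == positive & actual == positive)
  fp <- sum(predicted == positive & actual != positive)
  fn <- sum(predicted != positive & actual == positive)
  precision <- tp / (tp + fp)
  recall <- tp / (tp + fn)
  f1 <- 2 * (precision * recall) / (precision + recall)
  c(precision = precision, recall = recall, f1 = f1)
}

prf1(actual, predicted)  # with the vectors above: 0.6, 0.6, 0.6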
cat("Precision:", precision, "\n") cat("Recall:", recall, "\n") cat("F1 Score:", f1_score, "\n") caret, e1071, and Metrics that offer functions to calculate these metrics more conveniently and with additional functionality.If you're using the caret package, it becomes quite straightforward:
library(caret)

# positive = "yes" makes caret treat 'yes' as the positive class;
# otherwise it defaults to the first factor level ('no').
conf_matrix <- confusionMatrix(predicted, actual, positive = "yes")

cat("Precision:", conf_matrix$byClass['Pos Pred Value'], "\n")
cat("Recall:", conf_matrix$byClass['Sensitivity'], "\n")
cat("F1 Score:", conf_matrix$byClass['F1'], "\n")

In this example, confusionMatrix() from the caret package calculates a variety of metrics, including Precision (reported as Pos Pred Value), Recall (reported as Sensitivity), and the F1 Score.
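In recent versions of caret, byClass also exposes entries named 'Precision', 'Recall', and 'F1' directly, and passing mode = "prec_recall" prints them in the summary. A short sketch, assuming such a version is installed:

cm <- confusionMatrix(predicted, actual, positive = "yes", mode = "prec_recall")
cm$byClass[c("Precision", "Recall", "F1")]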