Confusion Matrix

The sklearn.metrics module allows plotting a confusion matrix either from a fitted classifier (with sklearn.metrics.plot_confusion_matrix()) or directly from a pre-computed confusion matrix (with the sklearn.metrics.ConfusionMatrixDisplay class).

A confusion matrix shows the discrepancy between the true labels of a dataset and the labels predicted by a classifier.
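A minimal sketch of this idea, using scikit-learn's confusion_matrix function on a small binary problem (the labels here are made up for illustration):

```python
from sklearn.metrics import confusion_matrix

# Toy true labels and a classifier's predictions
y_true = [0, 0, 0, 1, 1, 1]
y_pred = [0, 0, 1, 1, 1, 0]

# Rows are true labels, columns are predicted labels:
# cm[i, j] counts samples of true class i predicted as class j
cm = confusion_matrix(y_true, y_pred)
print(cm)
# [[2 1]
#  [1 2]]
```

The diagonal holds the correctly classified samples; off-diagonal entries show where the classifier confuses one class for another.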

While the confusion matrix plots generated by Scikit-Learn are very informative, they omit important evaluation measures that summarize classification performance. Precision, recall, F1 score and accuracy are examples of such measures, all of which can be derived from the confusion matrix. The sklearn.metrics.classification_report() function in the same module provides these measures.
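For example, classification_report() computes these measures from the same toy labels used above (made up for illustration):

```python
from sklearn.metrics import classification_report

y_true = [0, 0, 0, 1, 1, 1]
y_pred = [0, 0, 1, 1, 1, 0]

# Prints per-class precision, recall, F1 and support, plus overall accuracy
print(classification_report(y_true, y_pred))

# output_dict=True returns the same measures as a nested dict,
# convenient for programmatic access
report = classification_report(y_true, y_pred, output_dict=True)
print(report["accuracy"])  # 4 of 6 samples correct
```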

Daze adjusts plot_confusion_matrix to incorporate these evaluation measures directly in the confusion matrix plot, while maintaining an API very similar to the original Scikit-Learn function.
