A Crash Course of Model Calibration - Part 1
How to make ML models reflect true probabilities in their predictions?
Today, I am starting a two-article series on model calibration.
The first part of this series is available as a deep dive here: A Crash Course of Model Calibration – Part 1.
Motivation
A model is perfectly calibrated if the predicted probabilities of outcomes align closely with the actual outcomes.
For instance, if a model predicts an event with a 70% probability, then ideally, out of 100 such predictions, approximately 70 should result in the event occurring.
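The idea above can be sketched with a quick simulation. This is a minimal illustration (not from the article), assuming a hypothetical model that always reports a 70% probability for an event that truly occurs 70% of the time, so the observed event rate should land near the predicted probability:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical perfectly calibrated model: it predicts a 70% probability,
# and the event genuinely occurs 70% of the time.
predicted_prob = 0.7
n_predictions = 100_000
outcomes = rng.random(n_predictions) < predicted_prob

# For a well-calibrated model, the observed event rate should be
# close to the predicted probability.
observed_rate = outcomes.mean()
print(f"predicted: {predicted_prob:.2f}, observed: {observed_rate:.3f}")
```

With enough predictions, the observed rate converges to the predicted 0.7; miscalibration shows up precisely as a persistent gap between these two numbers.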
However, empirical studies have shown that many modern neural networks are poorly calibrated.
For instance, consider the following plot, which compares a LeNet (developed in 1998) with a ResNet (developed in 2016) on the CIFAR-100 dataset.
From the above plot, it is clear that:
The average confidence of LeNet (an old model) closely matches its accuracy.
In contrast, the average confidence of the ResNet (a relatively modern model) is substantially higher than its accuracy.
In other words, the LeNet model is well-calibrated since its confidence closely matches its accuracy. The ResNet, in contrast, achieves higher accuracy (roughly 0.7 vs. 0.5), yet that accuracy falls well short of its confidence. Despite being more accurate overall, the ResNet is overconfident in its predictions.
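The confidence–accuracy gap described above can be made concrete with a small sketch. This is a synthetic illustration (the numbers are assumptions, not the actual LeNet/ResNet measurements): we simulate an overconfident classifier whose reported confidence (its top predicted probability) sits around 0.95 while its true accuracy is only 0.70:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical overconfident model: reported confidences are high,
# but predictions are correct less often than those confidences suggest.
n_samples, true_accuracy = 10_000, 0.70
confidences = rng.uniform(0.90, 1.00, n_samples)  # model's top predicted probability
correct = rng.random(n_samples) < true_accuracy   # whether each prediction is right

avg_confidence = confidences.mean()
accuracy = correct.mean()
print(f"avg confidence:       {avg_confidence:.3f}")
print(f"accuracy:             {accuracy:.3f}")
print(f"overconfidence gap:   {avg_confidence - accuracy:.3f}")
```

A well-calibrated model would show a gap near zero; the sizable positive gap here is exactly the pattern the ResNet exhibits in the plot.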
Handling this is important because these systems are actually used in downstream decision-making.
An overly confident but not equally accurate model can be disastrous.
Thus, in this two-part article series, we shall:
dive into the details of model calibration
understand why miscalibration is a problem
see why modern models miscalibrate more than older ones
learn techniques to measure miscalibration, along with their limitations
explore techniques to address miscalibration
and more.
If you aspire to make valuable contributions in your data science role, this series will help you cultivate a more diversified skill set.
Have a good day!