
Given a Machine Learning model, the bias–variance tradeoff describes the tension between how accurately the model fits its training data and how well it generalises to previously unseen data that were not used to train it.

  • Bias is the error between the average model prediction and the ground truth; it indicates how capable the underlying model is of capturing the true relationship in the data. High bias means the model systematically misses relevant patterns.
  • Variance is the variability of the model's predictions around their average for a given input; it measures how sensitive the model is to the particular dataset it was trained on. Low variance means the predictions change little when the training data change, while high variance means the model fluctuates strongly with each new sample of training data.
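
The two error components above appear in the standard bias–variance decomposition of the expected squared error. For a target $y = f(x) + \varepsilon$ with noise variance $\sigma^2$ and a model $\hat f$ trained on a randomly drawn dataset, it can be written as:

```latex
\mathbb{E}\big[(y - \hat f(x))^2\big]
= \underbrace{\big(\mathbb{E}[\hat f(x)] - f(x)\big)^2}_{\text{Bias}^2}
+ \underbrace{\mathbb{E}\Big[\big(\hat f(x) - \mathbb{E}[\hat f(x)]\big)^2\Big]}_{\text{Variance}}
+ \underbrace{\sigma^2}_{\text{Irreducible noise}}
```

The expectations are taken over the random draw of the training set; the noise term $\sigma^2$ cannot be reduced by any choice of model.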

A model with high bias is likely underfitting, whereas a model with high variance is likely overfitting. Ideally a model would have both low bias and low variance, i.e. it would balance and minimise both types of error; in practice, making a model more flexible tends to lower its bias but raise its variance, and this tension is the tradeoff.
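
The tradeoff can be illustrated empirically. The sketch below (a minimal illustration, not taken from any particular library; the function names, the sine target, and all parameter values are assumptions chosen for the example) trains polynomial models of increasing degree on many resampled noisy datasets, then estimates bias squared as the squared gap between the average prediction and the true function, and variance as the spread of predictions across training sets:

```python
import numpy as np

rng = np.random.default_rng(0)

def true_fn(x):
    # Hypothetical ground-truth function for the demonstration
    return np.sin(x)

# Fixed evaluation grid and experiment settings (illustrative values)
x_test = np.linspace(0.0, np.pi, 50)
n_trials, n_samples, noise_sd = 200, 30, 0.3

def bias_variance(degree):
    """Estimate bias^2 and variance of a degree-`degree` polynomial fit
    by retraining on many independently sampled noisy datasets."""
    preds = np.empty((n_trials, x_test.size))
    for t in range(n_trials):
        x = rng.uniform(0.0, np.pi, n_samples)
        y = true_fn(x) + rng.normal(0.0, noise_sd, n_samples)
        coeffs = np.polyfit(x, y, degree)      # fit on this training sample
        preds[t] = np.polyval(coeffs, x_test)  # predict on the fixed grid
    mean_pred = preds.mean(axis=0)
    bias_sq = np.mean((mean_pred - true_fn(x_test)) ** 2)
    variance = np.mean(preds.var(axis=0))
    return bias_sq, variance

for d in (1, 3, 12):
    b, v = bias_variance(d)
    print(f"degree {d:2d}: bias^2 = {b:.4f}, variance = {v:.4f}")
```

With these settings, the degree-1 model shows high bias (it underfits the sine curve) and the degree-12 model shows high variance (it overfits each noisy sample), while an intermediate degree keeps both errors small.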