In mathematical statistics, the Kullback–Leibler divergence (KL divergence), also called relative entropy, is a measure of how one probability distribution differs from a second, reference probability distribution.
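For discrete probability distributions $P$ and $Q$ defined on the same sample space $\mathcal{X}$, the divergence of $P$ from the reference $Q$ is defined as

$$D_{\mathrm{KL}}(P \parallel Q) = \sum_{x \in \mathcal{X}} P(x)\, \log\!\left(\frac{P(x)}{Q(x)}\right),$$

with the convention that terms where $P(x) = 0$ contribute zero. As an illustration only, here is a minimal Python sketch (assuming NumPy is available and the hypothetical helper name `kl_divergence`) that evaluates this sum for two discrete distributions:

```python
import numpy as np

def kl_divergence(p, q):
    """Compute D_KL(P || Q) for discrete distributions given as probability arrays."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    # Only terms with p > 0 contribute; by convention 0 * log(0/q) = 0.
    mask = p > 0
    return np.sum(p[mask] * np.log(p[mask] / q[mask]))

# Example: divergence of a biased coin from a fair coin
p = [0.7, 0.3]
q = [0.5, 0.5]
print(kl_divergence(p, q))  # ~0.0823 nats
```

Using the natural logarithm gives the result in nats; using base 2 would give bits.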