We give an introduction to the concept of relative entropy (also called Kullback-Leibler divergence). We interpret relative entropy in terms of both coding and diversity, and sketch some connections with other subjects: Riemannian geometry (where relative entropy is infinitesimally a squared distance), measure theory, and statistics. We prove that relative entropy is uniquely characterized by a short list of properties.
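As a concrete point of reference for the concept discussed, the standard definition of relative entropy between finite probability distributions p and q is D(p || q) = Σᵢ pᵢ log(pᵢ/qᵢ). The following is a minimal illustrative sketch (not code from this paper); the function name and the choice of natural logarithm (nats) are our own:

```python
import math

def relative_entropy(p, q):
    """Relative entropy (Kullback-Leibler divergence) D(p || q)
    between two finite probability distributions, in nats.

    Assumes absolute continuity: q[i] == 0 implies p[i] == 0,
    so terms with p[i] == 0 contribute 0 by convention.
    """
    total = 0.0
    for pi, qi in zip(p, q):
        if pi > 0:
            total += pi * math.log(pi / qi)
    return total

# Example: relative entropy is zero iff the distributions coincide,
# and is asymmetric in its two arguments (so not a metric).
p = [0.5, 0.5]
q = [0.9, 0.1]
print(relative_entropy(p, p))  # 0.0
print(relative_entropy(p, q))
print(relative_entropy(q, p))
```

Note that D(p || q) ≥ 0 with equality iff p = q (Gibbs' inequality), but it is not symmetric, which is why the paper's Riemannian interpretation is only infinitesimal: near p = q, relative entropy behaves like a squared distance.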
Daniel L. Jafferis, Aitor Lewkowycz, Juan Maldacena, S. Josephine Suh