Shannon's entropy measure quantifies our ignorance of a system in terms of surprise and probability. Relative entropy, or Kullback-Leibler divergence, quantifies the extra surprise incurred when coding a posterior probability distribution P with a code derived from a prior probability distribution Q. This measure is relevant to interdisciplinary research, where crossing disciplinary boundaries requires methodological and conceptual bridges. I present three constructive uses of this measure in linguistics (measuring the semantic transparency of linguistic compounds), sport (modeling the behavior of a soccer team), and the study of human interactions (identifying significant romantic relations in a successful TV series), and conclude by pointing to further uses in interdisciplinary research.
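For reference, the standard textbook definition of the quantity discussed above is, for discrete distributions P and Q over a common alphabet,

$$
D_{\mathrm{KL}}(P \,\|\, Q) \;=\; \sum_{x} P(x) \log \frac{P(x)}{Q(x)},
$$

which, with base-2 logarithms, reads as the expected number of extra bits needed to encode samples drawn from P when using a code optimized for Q.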