BOOK-CHAPTER

Relative entropy

Yair Neuman

Year: 2024   Publisher: Edward Elgar Publishing eBooks   Pages: 426-429

Abstract

Shannon's entropy measure quantifies our ignorance of a system in terms of surprise and probability. Relative entropy, or Kullback-Leibler divergence, quantifies the extra surprise incurred when coding a posterior probability distribution P with a code derived from a prior probability distribution Q. This measure is relevant for interdisciplinary research, where crossing disciplinary boundaries requires methodological and conceptual bridges. I present three constructive uses of the measure in linguistics (e.g., measuring the semantic transparency of linguistic compounds), in sport (i.e., modeling the behavior of a soccer team), and in the study of human interactions (i.e., identifying significant romantic relations in a successful TV series), and conclude by pointing to further uses in interdisciplinary research.
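
As a concrete illustration of the coding interpretation above, here is a minimal sketch (not from the chapter; the toy distributions are hypothetical) of Kullback-Leibler divergence, D_KL(P || Q) = sum over x of p(x) log2(p(x)/q(x)), in Python:

```python
import math

def kl_divergence(p, q):
    """Relative entropy D_KL(P || Q) in bits: the expected extra code
    length paid when outcomes drawn from P are encoded with a code
    optimal for Q. Assumes p and q are aligned probability vectors
    and q[i] > 0 wherever p[i] > 0."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Hypothetical example: a skewed "posterior" P against a uniform "prior" Q.
p = [0.7, 0.2, 0.1]
q = [1 / 3, 1 / 3, 1 / 3]
print(kl_divergence(p, q))  # ~0.43 bits; 0 only when P and Q coincide
```

Note that the divergence is asymmetric (D_KL(P || Q) generally differs from D_KL(Q || P)), which matches the framing in the abstract: the prior Q supplies the code, and P supplies the events being coded.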

Keywords: Mathematics

Topics

Advanced Thermodynamics and Statistical Mechanics
Statistical Mechanics and Entropy
(both classified under Physical Sciences → Physics and Astronomy → Statistical and Nonlinear Physics)
