July 11, 2019
4:00PM - 5:00PM
Scott Lab N050
What is...? Seminar - Ethan Ackelsberg
Title: What is Shannon Entropy?
Speaker: Ethan Ackelsberg (Ohio State University)
Abstract: Claude Shannon, in his 1948 paper "A Mathematical Theory of Communication," defined the entropy of a random variable $X$ taking the values $x_1, \dots, x_n$ with the probabilities $p_1, \dots, p_n$, respectively, as $$H(X) = -\sum_{i=1}^n p_i \log p_i.$$ This quantity naturally measures the uncertainty of a random variable. We will show how it can also be understood as a measure of "information." Finally, we will discuss connections to other notions of entropy, particularly those used in graph theory and ergodic theory.
Seminar URL: http://math.osu.edu/whatis
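The entropy formula in the abstract can be computed directly. Below is a minimal sketch (not part of the seminar materials) that evaluates $H(X)$ for a few distributions, assuming logarithm base 2 so that entropy is measured in bits; the abstract leaves the base unspecified.

```python
from math import log2

def shannon_entropy(probs):
    """H(X) = -sum_i p_i * log2(p_i); zero-probability outcomes contribute nothing."""
    return -sum(p * log2(p) for p in probs if p > 0)

# A fair coin is maximally uncertain for two outcomes: 1 bit.
print(shannon_entropy([0.5, 0.5]))   # → 1.0

# A biased coin is more predictable, so its entropy is lower.
print(shannon_entropy([0.9, 0.1]))

# A deterministic outcome carries no uncertainty at all.
print(shannon_entropy([1.0]))        # → 0.0
```

Note that entropy is maximized by the uniform distribution: any bias toward particular outcomes reduces the uncertainty that $H(X)$ quantifies.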