# Information Entropy

> [!summary] Definition of Information Entropy
>
> - The average rate at which information is produced by a stochastic source of
>   data, or
> - The average amount of information conveyed by an event, when considering all
>   possible outcomes.

$
H(Y) = -\sum_{i=1}^m p_i\lg(p_i),\quad
p_i = P(Y = y_i)
$

- Conditional entropy: the average entropy of $Y$ given the value of $X$, built from the conditional probability $P(Y = y_i \mid X = x_j)$:

$
H(Y \mid X) = -\sum_{j=1}^n P(X = x_j)\sum_{i=1}^m P(Y = y_i \mid X = x_j)\lg P(Y = y_i \mid X = x_j)
$
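As a concrete illustration of both formulas, here is a minimal Python sketch. The `entropy` and `conditional_entropy` helpers are hypothetical names chosen for this example, not from any library; the joint distribution is assumed to be passed as a nested list with `joint[j][i]` $= P(X = x_j, Y = y_i)$, and logarithms are base 2 to match $\lg$ above.

```python
import math

def entropy(probs):
    """Shannon entropy H(Y) in bits: -sum_i p_i * lg(p_i).

    `probs` is a sequence of probabilities summing to 1; outcomes with
    zero probability contribute nothing (by convention 0 * lg 0 = 0).
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

def conditional_entropy(joint):
    """Conditional entropy H(Y | X) from a joint table joint[j][i] = P(X=x_j, Y=y_i).

    Uses H(Y | X) = -sum_j sum_i P(x_j, y_i) * lg P(y_i | x_j),
    where P(y_i | x_j) = P(x_j, y_i) / P(x_j).
    """
    h = 0.0
    for row in joint:                  # one row per outcome x_j
        px = sum(row)                  # marginal P(X = x_j)
        if px > 0:
            h -= sum(pxy * math.log2(pxy / px) for pxy in row if pxy > 0)
    return h

# A fair coin carries 1 bit per toss; a biased coin carries less.
print(entropy([0.5, 0.5]))   # 1.0
print(entropy([0.9, 0.1]))   # ~0.469

# If Y is an exact copy of X, knowing X leaves no uncertainty: H(Y | X) = 0.
print(conditional_entropy([[0.5, 0.0], [0.0, 0.5]]))  # 0.0
```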