In information theory, the entropy of a random variable is the average level of "information", "surprise", or "uncertainty" inherent to the variable's possible outcomes. Given a discrete random variable $X$, which takes values in the set $\mathcal{X}$ and is distributed according to $p \colon \mathcal{X} \to [0,1]$, the entropy is

$$H(X) = -\sum_{x \in \mathcal{X}} p(x) \log p(x).$$

— Wikipedia
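The definition above can be sketched directly in code. The snippet below is a minimal illustration, not from the source; the function name, the base-2 default (giving entropy in bits), and the convention $0 \log 0 = 0$ for zero-probability outcomes are the usual choices.

```python
import math

def entropy(probs, base=2):
    """Shannon entropy H(X) = -sum over x of p(x) * log p(x).

    probs: the probabilities p(x) of a discrete random variable's outcomes.
    Zero-probability outcomes are skipped, following the convention
    0 * log 0 = 0. With base=2 the result is measured in bits.
    """
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# A fair coin is maximally uncertain: 1 bit of entropy.
print(entropy([0.5, 0.5]))        # → 1.0
# A uniform choice among 4 outcomes: 2 bits.
print(entropy([0.25, 0.25, 0.25, 0.25]))  # → 2.0
```

A biased coin, e.g. `entropy([0.9, 0.1])`, gives a value strictly between 0 and 1 bit, reflecting that its outcome is less "surprising" on average than a fair coin's.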