
H(A) = -SUM_a p_a log_2 p_a

The quantity is zero when all events are of the same kind, that is, when p_a = 1 for exactly one a in A, and it is positive otherwise. Its upper limit is log_2 N, where N is the number of categories available (*see* degrees of freedom); this maximum is reached when the distribution is uniform over those categories, p_a = 1/N for all a in A (*see* variety, uncertainty, negentropy). The statistical entropy measure is the most basic measure of information theory. (Krippendorff)
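The two limiting cases above can be checked numerically. The sketch below (a minimal illustration, not part of the original entry) computes H(A) in bits, using the convention that terms with p_a = 0 contribute nothing, since p log p tends to 0 as p tends to 0:

```python
import math

def statistical_entropy(probs):
    """Shannon entropy H(A) = -SUM_a p_a log_2(p_a), in bits.

    Zero-probability terms are skipped (lim p->0 of p log p is 0).
    """
    return sum(-p * math.log2(p) for p in probs if p > 0)

# All events of the same kind (p_a = 1 for one category): entropy is zero.
print(statistical_entropy([1.0, 0.0, 0.0]))  # 0.0

# Uniform over N = 4 categories (p_a = 1/N): entropy reaches log_2 N = 2 bits.
print(statistical_entropy([0.25] * 4))       # 2.0
```

Any non-degenerate, non-uniform distribution falls strictly between these bounds; for example, the distribution (1/2, 1/4, 1/4) yields 1.5 bits.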


URL= http://cleamc11.vub.ac.be/ASC/STATIS_ENTRO.html