conditional entropy

From Wiktionary, the free dictionary

English

Noun

conditional entropy (plural conditional entropies)

  1. (information theory) The portion of a random variable's own Shannon entropy that is independent of another, given, random variable.
    The conditional entropy of a random variable Y given X (i.e., conditioned by X), denoted as H(Y|X), is equal to H(Y) − I(Y;X), where I(Y;X) is the mutual information between Y and X.
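
    A minimal numerical sketch of the definition, using a small made-up joint distribution p(x, y) over two binary variables (the distribution and variable names are illustrative assumptions, not part of the entry). It computes H(Y|X) directly from the conditional distributions and checks that it agrees with H(Y) − I(Y;X):

```python
import numpy as np

# Hypothetical joint distribution p(x, y) over two binary variables,
# chosen only to illustrate the definition.
p_xy = np.array([[0.4, 0.1],
                 [0.1, 0.4]])

p_x = p_xy.sum(axis=1)  # marginal p(x)
p_y = p_xy.sum(axis=0)  # marginal p(y)

def entropy(p):
    """Shannon entropy in bits, ignoring zero-probability outcomes."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Direct definition: H(Y|X) = sum over x of p(x) * H(Y | X = x)
h_y_given_x = sum(
    p_x[i] * entropy(p_xy[i] / p_x[i])
    for i in range(len(p_x))
    if p_x[i] > 0
)

# Form given in the entry: H(Y|X) = H(Y) - I(Y;X),
# with I(Y;X) = H(X) + H(Y) - H(X,Y).
mutual_info = entropy(p_x) + entropy(p_y) - entropy(p_xy.flatten())
h_y_given_x_alt = entropy(p_y) - mutual_info

print(h_y_given_x, h_y_given_x_alt)  # both ≈ 0.7219 bits
```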