# Cross-entropy

In information theory, the **cross-entropy** between two probability distributions $p$ and $q$ over the same underlying set of events measures the average number of bits needed to identify an event drawn from the set if a coding scheme used for the set is optimized for an estimated probability distribution $q$, rather than the true distribution $p$.
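The definition above can be sketched numerically. The following is a minimal illustration (not from the article) computing $H(p, q) = -\sum_x p(x) \log_2 q(x)$ in bits, using a true distribution $p$ and a uniform estimate $q$ over four events; the function name `cross_entropy` is chosen here for illustration.

```python
import math

def cross_entropy(p, q):
    """Cross-entropy H(p, q) = -sum over x of p(x) * log2(q(x)), in bits.

    p is the true distribution; q is the estimated distribution used
    to build the code. Events with p(x) == 0 contribute nothing.
    """
    return -sum(pi * math.log2(qi) for pi, qi in zip(p, q) if pi > 0)

# True distribution p over four events, and a uniform estimate q.
p = [0.5, 0.25, 0.125, 0.125]
q = [0.25, 0.25, 0.25, 0.25]

print(cross_entropy(p, p))  # 1.75  (entropy of p: the optimal average code length)
print(cross_entropy(p, q))  # 2.0   (coding with q costs more bits on average)
```

Note that $H(p, p)$ recovers the entropy of $p$, and $H(p, q) \ge H(p, p)$, with the gap being the Kullback–Leibler divergence $D_{\mathrm{KL}}(p \parallel q)$.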