Cross entropy

(information theory) given two probability distributions, the average number of bits needed to identify an event when the coding scheme is optimized for the ‘wrong’ probability distribution rather than the true distribution
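The definition above corresponds to the standard formula H(p, q) = −Σ p(x) log₂ q(x). A minimal sketch in Python (the function name and example distributions are illustrative, not from the source):

```python
import math

def cross_entropy(p, q):
    """Cross entropy H(p, q) in bits: the expected code length when
    events drawn from p are encoded with a scheme optimal for q."""
    return -sum(pi * math.log2(qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.25, 0.25]   # true distribution
q = [0.25, 0.25, 0.5]   # 'wrong' distribution the code is optimized for

entropy = cross_entropy(p, p)        # 1.5 bits: optimal coding for p
wrong_code = cross_entropy(p, q)     # 1.75 bits: penalty for using q
```

Note that H(p, p) is simply the entropy of p, and H(p, q) ≥ H(p, p), with the gap being the Kullback–Leibler divergence.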

Pronunciation
/krɒs ˈɛntrəpi/
/krɔs ˈɛntrəpi/