toIPA
Cross entropy
In information theory, given two probability distributions, the average number of bits needed to identify an event drawn from the true distribution when the coding scheme is optimized for the 'wrong' probability distribution rather than the true one
Pronunciation
/krɔs ˈɛntrəpi/
/krɒs ˈɛntrəpi/
Categories
mathematical object
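The definition above can be sketched numerically. A minimal illustration, assuming discrete distributions given as probability lists (the function name `cross_entropy` and the example distributions are chosen for illustration):

```python
import math

def cross_entropy(p, q):
    """Average number of bits to identify an event drawn from the true
    distribution p when the code is optimized for the distribution q."""
    # Terms with p_i == 0 contribute nothing, so they are skipped.
    return -sum(pi * math.log2(qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.5]    # true distribution
q = [0.25, 0.75]  # the 'wrong' distribution the code is built for

print(cross_entropy(p, p))  # coding for the true distribution: 1.0 bit (the entropy of p)
print(cross_entropy(p, q))  # coding for the wrong distribution costs more bits on average
```

When p equals q the cross entropy reduces to the entropy of p; coding for any other distribution can only increase the average number of bits.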