One of the most intuitive explanations of information entropy, cross entropy, and relative entropy (KL divergence). Highly recommend :)

https://www.youtube.com/watch?v=ErfnhcEV1O8
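As a quick reference for the three quantities the video covers, here is a minimal sketch (not from the video itself) of entropy, cross entropy, and KL divergence for discrete distributions, using base-2 logs so the results are in bits:

```python
import math

def entropy(p):
    # H(p) = -sum_i p_i * log2(p_i); skip zero-probability terms
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def cross_entropy(p, q):
    # H(p, q) = -sum_i p_i * log2(q_i): expected code length when
    # events follow p but the code is optimized for q
    return -sum(pi * math.log2(qi) for pi, qi in zip(p, q) if pi > 0)

def kl_divergence(p, q):
    # D_KL(p || q) = H(p, q) - H(p): the extra bits paid for using
    # the wrong distribution q instead of the true distribution p
    return cross_entropy(p, q) - entropy(p)

p = [0.5, 0.25, 0.25]
q = [0.25, 0.25, 0.5]
print(entropy(p))         # 1.5 bits
print(cross_entropy(p, q))  # 1.75 bits
print(kl_divergence(p, q))  # 0.25 bits
```

Note that KL divergence is always non-negative and is zero exactly when p and q agree, which is why it serves as a (non-symmetric) measure of how far q is from p.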

Notes