Linear Digressions
KL Divergence
- Author: Various
- Narrator: Various
- Publisher: Podcast
- Duration: 0:25:38
Synopsis
Kullback-Leibler divergence, or KL divergence, is a measure of the information lost when you try to approximate one distribution with another. It comes to us originally from information theory, but today underpins other, more machine-learning-focused algorithms like t-SNE. And boy oh boy can it be tough to explain. But we're trying our hardest in this episode!
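
For a concrete sense of the quantity the episode discusses, here is a minimal sketch of the standard discrete KL divergence formula, D(P || Q) = sum_i P(i) log(P(i)/Q(i)); the distributions P and Q below are made-up illustration values, not taken from the episode.

    import numpy as np

    # P is the "true" distribution, Q is the approximation
    # (values assumed purely for illustration).
    p = np.array([0.5, 0.3, 0.2])
    q = np.array([0.4, 0.4, 0.2])

    # D(P || Q) = sum_i P(i) * log(P(i) / Q(i)): the expected extra
    # information (in nats) incurred by encoding samples from P with
    # a code optimized for Q.
    kl_pq = np.sum(p * np.log(p / q))
    print(f"D(P || Q) = {kl_pq:.4f} nats")

Note that the measure is asymmetric: D(P || Q) and D(Q || P) generally differ, which is part of what makes it an "information loss" rather than a distance.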