That is, the Shannon entropy of a distribution p over n outcomes can be understood in terms of the KL divergence of that distribution from a uniform prior u: H(p) = log n − D_KL(p ‖ u), so maximizing entropy is the same as minimizing divergence from uniform ...
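As a quick sketch of this identity, the following snippet (names and the example distribution are illustrative, not from the source) computes entropy and KL divergence directly and checks that H(p) = log n − D_KL(p ‖ u) for a uniform u:

```python
import math

def entropy(p):
    # Shannon entropy in nats: H(p) = -sum_i p_i * log(p_i)
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def kl_divergence(p, q):
    # D_KL(p || q) = sum_i p_i * log(p_i / q_i)
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Example distribution (hypothetical, chosen for illustration)
p = [0.5, 0.25, 0.125, 0.125]
n = len(p)
u = [1.0 / n] * n  # uniform prior over the same n outcomes

# Identity: H(p) = log(n) - D_KL(p || u)
lhs = entropy(p)
rhs = math.log(n) - kl_divergence(p, u)
assert abs(lhs - rhs) < 1e-12
```

The identity follows by expanding D_KL(p ‖ u) = Σ p_i log(p_i · n) = log n − H(p); since log n is a constant for fixed n, divergence from uniform and entropy determine each other.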