Calculate the entropy of a distribution for given probability values.
If only probabilities pk are given, the entropy is calculated as S = -sum(pk * log(pk), axis=0).
If qk is not None, then compute the relative entropy (also known as Kullback-Leibler divergence or Kullback-Leibler distance) S = sum(pk * log(pk / qk), axis=0).
This routine will normalize pk and qk if they don’t sum to 1.
Parameters:
    pk : sequence
        Defines the (discrete) distribution. pk[i] is the (possibly unnormalized) probability of event i.
    qk : sequence, optional
        Sequence against which the relative entropy is computed. Should be in the same format as pk.
    base : float, optional
        The logarithmic base to use; defaults to e (natural logarithm).

Returns:
    S : float
        The calculated entropy (or relative entropy, if qk is given).
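
A minimal usage sketch, assuming this routine is scipy.stats.entropy (the import path is an assumption; the approximate values in the comments follow from the formulas above):

    from scipy.stats import entropy

    # Shannon entropy of a fair coin, in bits (base-2 logarithm)
    entropy([0.5, 0.5], base=2)            # -> 1.0

    # Unnormalized values are scaled to sum to 1: [9, 3] -> [0.75, 0.25]
    entropy([9, 3], base=2)                # -> ~0.811

    # With qk given, the relative entropy S = sum(pk * log(pk / qk)) is returned
    entropy([0.5, 0.5], qk=[0.9, 0.1])     # -> ~0.511 (in nats, since base defaults to e)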