Kullback-Leibler divergence of identical images - scipy.stats.entropy

Date: 2014-09-17 21:37:45

Tags: python statistics scipy

I am computing the KL divergence between the histograms of 3 images:

import numpy as np
import scipy.misc
from skimage import color
import skimage
from sklearn.datasets import load_sample_image

# all images converted to 8-bit grayscale
lena = scipy.misc.lena().astype('uint8')
china = skimage.img_as_ubyte(color.rgb2grey(load_sample_image("china.jpg")))
flower = skimage.img_as_ubyte(color.rgb2grey(load_sample_image("flower.jpg")))

# histograms for all images
# (note: bins=range(256) gives the 256 edges 0..255, i.e. only 255 bins,
# so the intensities 254 and 255 end up sharing the last bin)
hist_lena, bin_edges_lena = np.histogram(lena, bins = range(256))
hist_china, bin_edges_china = np.histogram(china, bins = range(256))
hist_flower, bin_edges_flower = np.histogram(flower, bins = range(256))
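(As an aside, and unrelated to the nan issue below: if one bin per 8-bit intensity value is wanted, the histogram presumably needs 257 edges rather than 256. A minimal sketch, using the same arrays as above:

# 257 edges -> 256 bins, one bin per intensity value 0..255
hist_lena, bin_edges_lena = np.histogram(lena, bins=np.arange(257))
)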

But when I use scipy.stats.entropy to compare an image with itself, I get different results for each image:

# http://docs.scipy.org/doc/scipy-dev/reference/generated/scipy.stats.entropy.html
from scipy.stats import entropy

print entropy(pk=hist_lena, qk=hist_lena) # nan
print entropy(pk=hist_china, qk=hist_china) # -0.0
print entropy(pk=hist_flower, qk=hist_flower) # nan

I was expecting zero (unsigned?) as the result in all three cases.

Am I applying the entropy function correctly? Does applying it to image histograms like this seem correct?
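My best guess at what is happening: the two histograms that produce nan presumably contain empty bins, and in the scipy version used here a bin where both pk and qk are zero appears to be evaluated as 0 * log(0/0) = nan instead of contributing zero (newer scipy releases treat this case as 0). A minimal workaround sketch, restricting the comparison to the bins both histograms actually use; kl_on_support is my own helper, not a scipy function:

import numpy as np
from scipy.stats import entropy

def kl_on_support(pk, qk):
    # Compare only bins where both histograms are non-zero, so no
    # 0 * log(0/0) terms can appear.  Caveat: for pk != qk this
    # silently drops bins where qk == 0, which would otherwise make
    # the divergence infinite.
    pk = np.asarray(pk, dtype=float)
    qk = np.asarray(qk, dtype=float)
    support = (pk > 0) & (qk > 0)
    return entropy(pk[support], qk[support])

print kl_on_support(hist_lena, hist_lena)      # expected 0.0
print kl_on_support(hist_flower, hist_flower)  # expected 0.0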

0 Answers:

No answers yet