How can I fit the parameters of a Weibull distribution in Python by minimizing the Kullback-Leibler divergence?

Problem description

I want to find the parameters of a Weibull distribution by minimizing the Kullback-Leibler divergence between it and a target distribution. I found code here that does the same thing, and I replaced the normal distribution in the original code with a Weibull distribution. I don't understand why I get NaN parameters and a NaN Kullback-Leibler divergence value. Can anyone help?

import numpy as np
import pandas as pd
from matplotlib import pyplot as plt
import tensorflow.compat.v1 as tf
tf.disable_v2_behavior()
import seaborn as sns
sns.set()
from scipy.stats import weibull_min

learning_rate = 0.001
epochs = 100

x = np.arange(0,2000,0.001)
p_pdf=weibull_min.pdf(x,1.055,468).reshape(1,-1)
p = tf.placeholder(tf.float64,shape=p_pdf.shape)

alpha = tf.Variable(np.zeros(1))
beta = tf.Variable(np.eye(1))

weibull=(beta / alpha) * ((x / alpha)**(beta - 1)) * tf.exp(-((x / alpha)**beta))
q = weibull
kl_divergence = tf.reduce_sum(tf.where(p == 0,tf.zeros(p_pdf.shape,tf.float64),p * tf.log(p / q)))
optimizer = tf.train.GradientDescentOptimizer(learning_rate).minimize(kl_divergence)
init = tf.global_variables_initializer()

with tf.Session() as sess:
    sess.run(init)
    history = []    
    alphas = []
    betas = []
    
    
    for i in range(epochs):
        sess.run(optimizer,{ p: p_pdf })
        
        if i % 10 == 0:
            history.append(sess.run(kl_divergence,{ p: p_pdf }))
            alphas.append(sess.run(alpha)[0])
            betas.append(sess.run(beta)[0][0])
            
    for a,b in zip(alphas,betas):

        q_pdf =weibull_min.pdf(x,b,a)
        plt.plot(x,q_pdf.reshape(-1,1),c='red')

plt.title('KL(P||Q) = %1.3f' % history[-1])
plt.plot(x,p_pdf.reshape(-1),linewidth=3)
plt.show()  
plt.plot(history)
plt.show()   
sess.close()

Solution

No effective solution to this problem has been found yet.

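One likely source of the NaN values is the initialization: alpha = tf.Variable(np.zeros(1)) makes alpha exactly zero, so beta / alpha and x / alpha are already infinite (and 0/0 is NaN) before the first gradient step, and everything downstream becomes NaN. The p == 0 test may also not behave as an elementwise comparison in TF1 graph mode. Below is a minimal sketch of the same KL-minimization idea with positive starting values and a masked KL sum. The starting guesses (400 and 1.0), the coarser grid that starts above zero, and passing 468 as scale= to weibull_min.pdf are assumptions added for illustration, not part of the original post.

import numpy as np
import tensorflow.compat.v1 as tf
tf.disable_v2_behavior()
from scipy.stats import weibull_min

learning_rate = 0.001
epochs = 100

# Grid starts above zero so (x/alpha)**(beta-1) stays finite even if beta < 1.
x = np.arange(0.1, 2000.0, 0.1)

# Target pdf: shape 1.055, scale 468 (scale= passed explicitly, since the third
# positional argument of weibull_min.pdf is loc, not scale -- an assumption here).
p_pdf = weibull_min.pdf(x, 1.055, scale=468).reshape(1, -1)
p = tf.placeholder(tf.float64, shape=p_pdf.shape)

# Positive starting guesses (assumed values); np.zeros(1) would make alpha = 0
# and turn beta/alpha and x/alpha into inf/NaN at the very first evaluation.
alpha = tf.Variable(np.array([400.0]))   # scale parameter
beta = tf.Variable(np.array([[1.0]]))    # shape parameter

# Candidate Weibull pdf built from the current alpha and beta.
q = (beta / alpha) * ((x / alpha) ** (beta - 1.0)) * tf.exp(-((x / alpha) ** beta))

# KL(P||Q): sum p*log(p/q) only where p > 0 (elementwise mask) to avoid 0*log(0).
mask = p > 0
safe_ratio = tf.where(mask, p / q, tf.ones_like(p))
kl_divergence = tf.reduce_sum(tf.where(mask, p * tf.log(safe_ratio), tf.zeros_like(p)))

optimizer = tf.train.GradientDescentOptimizer(learning_rate).minimize(kl_divergence)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for i in range(epochs):
        sess.run(optimizer, {p: p_pdf})
        if i % 10 == 0:
            print(i, sess.run(kl_divergence, {p: p_pdf}),
                  sess.run(alpha)[0], sess.run(beta)[0][0])

With plain gradient descent at these magnitudes alpha moves very slowly, so in practice the learning rate, the number of epochs, or the parameterization (for example optimizing log(alpha) instead of alpha) may need adjusting; an adaptive optimizer such as Adam is a common alternative.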