How do I get SHAP values for a GPyTorch regressor?

Problem description

I am having trouble getting SHAP values for a GPyTorch regressor model.

Here is the GPyTorch code I am using:

import math
import torch
import gpytorch


'''
Data
'''

num_features = 3

# Training data is 100 points in [0,1] 
train_x = torch.rand(100,num_features)

# True function is sin(2*pi*x) with Gaussian noise
train_y = torch.sin(train_x[:,0] * (2 * math.pi)) + torch.randn(train_x[:,0].size()) * math.sqrt(0.04)

test_x = torch.rand(50,num_features)


'''
GP Model
'''

kernel = gpytorch.kernels.RBFKernel()    

# We will use the simplest form of GP model, exact inference
class ExactGPModel(gpytorch.models.ExactGP):
    def __init__(self, train_x, train_y, likelihood, kernel):
        super(ExactGPModel, self).__init__(train_x, train_y, likelihood)
        self.mean_module = gpytorch.means.ConstantMean()
        self.covar_module = gpytorch.kernels.ScaleKernel(kernel)

    def forward(self, x):
        mean_x = self.mean_module(x)
        covar_x = self.covar_module(x)
        return gpytorch.distributions.MultivariateNormal(mean_x, covar_x)

# initialize likelihood and model
likelihood = gpytorch.likelihoods.GaussianLikelihood()
model = ExactGPModel(train_x, train_y, likelihood, kernel)
    

'''
Train
'''

training_iter = 50


# Find optimal model hyperparameters
# Put model on train mode
model.train()
likelihood.train()


# Use the adam optimizer
optimizer = torch.optim.Adam(model.parameters(),lr=0.1)  

# "Loss" for GPs - the marginal log likelihood
mll = gpytorch.mlls.ExactMarginalLogLikelihood(likelihood,model)

for i in range(training_iter):
    # Zero gradients from previous iteration
    optimizer.zero_grad()
    # Output from model
    output = model(train_x)
    # Calc loss and backprop gradients
    loss = -mll(output, train_y)
    loss.backward()
    print('Iter %d/%d - Loss: %.3f   lengthscale: %.3f   noise: %.3f' % (
        i + 1, training_iter, loss.item(),
        model.covar_module.base_kernel.lengthscale.item(),
        model.likelihood.noise.item()
    ))
    optimizer.step()


'''
Prediction
'''

# Get into evaluation (predictive posterior) mode
model.eval()
likelihood.eval()

with torch.no_grad(), gpytorch.settings.fast_pred_var():
    y_pred = likelihood(model(test_x))

y_pred_mean = y_pred.mean   

I have tried different ways of getting SHAP values from the model, none of which worked. What I tried:

"""
SHAP
"""
import shap

train_x_arr = train_x.detach().numpy() 
test_x_arr = test_x.detach().numpy()

First attempt:

# explain all the predictions in the test set
explainer = shap.KernelExplainer(model,train_x_arr)
shap_values = explainer.shap_values(test_x_arr)

Error

AttributeError: 'numpy.ndarray' object has no attribute 'ndimension'
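This error suggests that `KernelExplainer` is calling the model with numpy arrays, while a GPyTorch module expects torch tensors (numpy arrays have no `ndimension` attribute). One possible workaround, sketched below under that assumption, is a numpy-in/numpy-out wrapper around the posterior mean; the name `gp_predict` is made up for illustration:

```python
import numpy as np
import torch

# Hypothetical wrapper (the names `model` and `likelihood` refer to the
# trained objects from the code above). KernelExplainer passes numpy arrays
# to the function it explains; GPyTorch expects torch tensors, so we convert
# on the way in and return the posterior mean as a numpy array on the way out.
def gp_predict(X, model, likelihood):
    model.eval()
    likelihood.eval()
    with torch.no_grad():
        dist = likelihood(model(torch.from_numpy(np.asarray(X)).float()))
    return dist.mean.numpy()
```

With this in place, `shap.KernelExplainer(lambda X: gp_predict(X, model, likelihood), train_x_arr)` would receive a plain numpy prediction function instead of the GPyTorch module itself (an assumption, not verified against every shap version).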

Second attempt:

explainer = shap.Explainer(model)
shap_values = explainer(test_x_arr)

Error

TypeError: 'NoneType' object is not callable

If you have any ideas on how to solve this, please let me know.

