GPyTorch multi-class classification: what is num_features in gpytorch.likelihoods.SoftmaxLikelihood?

Problem description

I am trying to build a multi-class classifier based on the notebook example by changing likelihoods.BernoulliLikelihood to likelihoods.SoftmaxLikelihood.

However, I cannot find a suitable value for the num_features argument. I have tried several values, but all of them raise errors. I would appreciate any guidance on this.
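
For context (an editorial note, not part of the original question): num_features is not the input dimensionality. In recent GPyTorch releases, SoftmaxLikelihood models the class logits as a learned (num_classes x num_features) mixture of num_features latent GP functions, so num_features is the number of latent functions the model is expected to produce. A minimal shape-only sketch, with arbitrary sample and feature counts, assuming a recent GPyTorch release:

import torch
import gpytorch

# SoftmaxLikelihood mixes num_features latent functions into num_classes logits.
# The feature count (4) is an arbitrary illustrative choice, not a prescribed value.
likelihood = gpytorch.likelihoods.SoftmaxLikelihood(num_features=4, num_classes=3)

# Latent function samples of shape (num_mc_samples, num_data, num_features):
# 20 Monte Carlo samples of 4 latent functions evaluated at 10 inputs.
function_samples = torch.randn(20, 10, 4)

# Calling the likelihood on raw function samples gives the conditional p(y | f),
# a Categorical distribution over the 3 classes at each of the 10 points.
conditional = likelihood(function_samples)
print(conditional.probs.shape)  # torch.Size([20, 10, 3])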

Code

import torch
import gpytorch

from gpytorch.models import AbstractVariationalGP
from gpytorch.variational import CholeskyVariationalDistribution
from gpytorch.variational import VariationalStrategy
from gpytorch.mlls.variational_elbo import VariationalELBO


"""
Data
"""

train_x = torch.linspace(0,1,10)
train_y = torch.tensor([1,-1,1])

num_classes = 3

num_features = 1



"""
Model
"""

class GPClassificationModel(AbstractVariationalGP):
    def __init__(self,train_x):
        variational_distribution = CholeskyVariationalDistribution(train_x.size(0))
        variational_strategy = VariationalStrategy(self,train_x,variational_distribution)
        super(GPClassificationModel,self).__init__(variational_strategy)
        self.mean_module = gpytorch.means.ConstantMean()
        self.covar_module = gpytorch.kernels.ScaleKernel(gpytorch.kernels.RBFKernel())

    def forward(self,x):
        mean_x = self.mean_module(x)
        covar_x = self.covar_module(x)
        latent_pred = gpytorch.distributions.MultivariateNormal(mean_x, covar_x)
        return latent_pred   
        

# Initialize model and likelihood
model = GPClassificationModel(train_x)
likelihood = gpytorch.likelihoods.SoftmaxLikelihood(num_features=num_features, num_classes=num_classes)


"""
Train
"""

model.train()
likelihood.train()

optimizer = torch.optim.Adam(model.parameters(),lr=0.1)

# "Loss" for GPs - the marginal log likelihood
# train_y.numel() refers to the amount of training data
mll = VariationalELBO(likelihood,model,train_y.numel())

training_iter = 50
for i in range(training_iter):
    # Zero backpropped gradients from previous iteration
    optimizer.zero_grad()
    # Get predictive output
    output = model(train_x)
    # Calc loss and backprop gradients
    loss = -mll(output,train_y)
    loss.backward()
    print('Iter %d/%d - Loss: %.3f' % (i + 1,training_iter,loss.item()))
    optimizer.step()
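
A possible direction (an editorial sketch, not a confirmed fix from the original post): with SoftmaxLikelihood the model has to output num_features latent functions rather than a single one, and the class labels have to be integer indices in [0, num_classes). Under a recent GPyTorch API this is typically done with batch-shaped means/kernels and a multitask variational strategy, roughly as below; the choice of num_features = 4, the random labels, and the inducing-point layout are all illustrative assumptions.

import torch
import gpytorch

num_classes = 3
num_features = 4  # number of latent GPs; a modelling choice, not the input dimension

class MulticlassGPModel(gpytorch.models.ApproximateGP):
    def __init__(self, inducing_points):
        # One variational distribution per latent function (batch dimension = num_features)
        variational_distribution = gpytorch.variational.CholeskyVariationalDistribution(
            inducing_points.size(-2), batch_shape=torch.Size([num_features])
        )
        # Treat the batch of latent functions as num_features "tasks" so the model's
        # output has the shape SoftmaxLikelihood expects
        variational_strategy = gpytorch.variational.IndependentMultitaskVariationalStrategy(
            gpytorch.variational.VariationalStrategy(
                self, inducing_points, variational_distribution, learn_inducing_locations=True
            ),
            num_tasks=num_features,
        )
        super().__init__(variational_strategy)
        self.mean_module = gpytorch.means.ConstantMean(batch_shape=torch.Size([num_features]))
        self.covar_module = gpytorch.kernels.ScaleKernel(
            gpytorch.kernels.RBFKernel(batch_shape=torch.Size([num_features])),
            batch_shape=torch.Size([num_features]),
        )

    def forward(self, x):
        mean_x = self.mean_module(x)
        covar_x = self.covar_module(x)
        return gpytorch.distributions.MultivariateNormal(mean_x, covar_x)

# Labels must be class indices in {0, ..., num_classes - 1}, one per training input
train_x = torch.linspace(0, 1, 10)
train_y = torch.randint(0, num_classes, (10,))

# One set of inducing points per latent function: shape (num_features, num_inducing, input_dim)
inducing_points = train_x.unsqueeze(-1).unsqueeze(0).repeat(num_features, 1, 1)

model = MulticlassGPModel(inducing_points)
likelihood = gpytorch.likelihoods.SoftmaxLikelihood(num_features=num_features, num_classes=num_classes)
mll = gpytorch.mlls.VariationalELBO(likelihood, model, num_data=train_y.numel())

With a setup like this, the training loop above would also need to optimize the likelihood's parameters (its mixing weights), for example by passing both model.parameters() and likelihood.parameters() to the optimizer.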

Solution

No working solution for this problem has been found yet.

