How to make backward() work when the loss comes from eigenvectors in PyTorch?

Problem description

My loss looks like this:

import torch

e1, v1 = torch.eig(A, eigenvectors=True)  # eigenvectors=True is needed, otherwise v1 is empty
e2, v2 = torch.eig(B, eigenvectors=True)
sim = torch.matmul(v1, v2.permute(0, 2, 1))
loss_sim = torch.sum(sim)
loss_sim.backward()

where A and B are intermediate tensors of shape SEQUENCE_LENGTH * SEQUENCE_LENGTH.

My code fails with the following error:

File "/usr/local/Anaconda3/lib/python3.8/site-packages/torch/tensor.py", line 185, in backward
    torch.autograd.backward(self, gradient, retain_graph, create_graph)
File "/usr/local/Anaconda3/lib/python3.8/site-packages/torch/autograd/__init__.py", line 125, in backward
    Variable._execution_engine.run_backward(
RuntimeError: eig_backward: Backward calculation does not support complex eigenvalues at the moment.
Exception raised from eig_backward at /opt/conda/conda-bld/pytorch_1595629395347/work/torch/csrc/autograd/generated/Functions.cpp:1877 (most recent call first):
frame #0: c10::Error::Error(c10::SourceLocation, std::string) + 0x4d (0x7f74932e077d in /usr/local/Anaconda3/lib/python3.8/site-packages/torch/lib/libc10.so)
frame #1: torch::autograd::generated::EigBackward::apply(std::vector<at::Tensor, std::allocator<at::Tensor> >&&) + 0xb5d (0x7f74cc4b894d in /usr/local/Anaconda3/lib/python3.8/site-packages/torch/lib/libtorch_cpu.so)
frame #2: <unknown function> + 0x30d1017 (0x7f74ccb0c017 in /usr/local/Anaconda3/lib/python3.8/site-packages/torch/lib/libtorch_cpu.so)
frame #3: torch::autograd::Engine::evaluate_function(std::shared_ptr<torch::autograd::GraphTask>&, torch::autograd::Node*, torch::autograd::InputBuffer&, std::shared_ptr<torch::autograd::ReadyQueue> const&) + 0x1400 (0x7f74ccb07860 in /usr/local/Anaconda3/lib/python3.8/site-packages/torch/lib/libtorch_cpu.so)
frame #4: torch::autograd::Engine::thread_main(std::shared_ptr<torch::autograd::GraphTask> const&) + 0x451 (0x7f74ccb08401 in /usr/local/Anaconda3/lib/python3.8/site-packages/torch/lib/libtorch_cpu.so)
frame #5: torch::autograd::Engine::thread_init(int, std::shared_ptr<torch::autograd::ReadyQueue> const&, bool) + 0x89 (0x7f74ccb00579 in /usr/local/Anaconda3/lib/python3.8/site-packages/torch/lib/libtorch_cpu.so)
frame #6: torch::autograd::python::PythonEngine::thread_init(int, bool) + 0x4a (0x7f74d0e2a1ba in /usr/local/Anaconda3/lib/python3.8/site-packages/torch/lib/libtorch_python.so)
frame #7: <unknown function> + 0xc819d (0x7f74d393719d in /usr/local/Anaconda3/lib/python3.8/site-packages/torch/lib/../../../.././libstdc++.so.6)
frame #8: <unknown function> + 0x76ba (0x7f74ed28c6ba in /lib/x86_64-linux-gnu/libpthread.so.0)
frame #9: clone + 0x6d (0x7f74ecfc241d in /lib/x86_64-linux-gnu/libc.so.6)

How can I get backward() to work with torch.eig()?

Solution

There are several problems with this example.

  1. The backward of torch.eig only works for matrices with real eigenvalues, as per the documentation.
  2. This loss is not well-defined. As stated in the torch.linalg.eig documentation, the eigenvectors are not uniquely defined (each one is only determined up to a scalar factor), so neither this loss nor its gradient is well-defined.
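To see why point 2 matters, here is a minimal sketch (using torch.linalg.eig on a random test matrix, which is my own example, not the asker's model) showing that an eigenvector multiplied by -1 is still a valid eigenvector, so any loss whose value changes under that sign flip cannot be well-defined:

```python
import torch

torch.manual_seed(0)
A = torch.randn(4, 4)

# torch.linalg.eig returns complex eigenpairs even for real input;
# cast A to the same complex dtype so the matmul below is valid.
eigvals, eigvecs = torch.linalg.eig(A)
Ac = A.to(eigvecs.dtype)

v = eigvecs[:, 0]   # first eigenvector (eigenvectors are the columns)
lam = eigvals[0]

# v satisfies the eigenvalue equation A v = lambda v ...
assert torch.allclose(Ac @ v, lam * v, atol=1e-4)

# ... and so does -v (or v times any unit-modulus phase factor),
# so a loss like sum(v1 @ v2.T) changes with the arbitrary choice
# of phase and is therefore ill-defined.
assert torch.allclose(Ac @ (-v), lam * (-v), atol=1e-4)

print("both v and -v are eigenvectors of A")
```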

The first problem is easy to fix: just use torch.linalg.eig, which returns the complex eigenvalues (and eigenvectors) of the matrix. For the second problem, you need to think harder about your model and whether you really want to use an eigenvalue decomposition at all. Also have a look at the torch.linalg documentation, as it contains some useful notes on how this function relates to the others in the module, which may help you pick a better-suited one.
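As a minimal sketch of the first fix (the random matrix and the eigenvalue-magnitude loss below are stand-ins I chose for illustration, not the asker's actual model): torch.linalg.eig does support backward, provided the loss is real-valued and does not depend on the arbitrary eigenvector phases. A loss built from the eigenvalue magnitudes satisfies both conditions:

```python
import torch

torch.manual_seed(0)
A = torch.randn(5, 5, requires_grad=True)

# torch.linalg.eig returns complex eigenvalues/eigenvectors
# even when the input matrix is real.
eigvals, eigvecs = torch.linalg.eig(A)

# The loss must be real-valued for backward(); .abs() maps the
# complex eigenvalues to real magnitudes, and eigenvalues are
# unaffected by the eigenvector phase ambiguity, so the gradient
# is well-defined here.
loss = eigvals.abs().sum()
loss.backward()

print(A.grad.shape)  # torch.Size([5, 5])
```

Note that the gradient is only finite when the eigenvalues are distinct, which holds for a generic random matrix but may fail for structured inputs.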