Problem description
For a course I'm taking, I'm implementing DeepDream for feature visualization, and I chose to build the project in the browser. For background, I'm new to machine learning and tensorflow.js.
For the most part I followed TensorFlow's Python tutorial:
https://www.tensorflow.org/tutorials/generative/deepdream#calculate_loss
So far I have been able to implement every step up to gradient ascent. I suspected that computing the gradients would be a challenge, because the Python API has the convenient tf.GradientTape() construct while tensorflow.js does not. As I understand it, I have to use tf.grad() or tf.grads() instead.
Here is my loss function:
function calc_loss(model, img_tensor) {
  // model.predict() may return a single tensor or an array of tensors,
  // so normalize to an array first.
  const activations = [].concat(model.predict(img_tensor));
  // Reduce each activation map to its mean, then add the means together.
  const losses = activations.map(v => tf.mean(v));
  const means = losses.reduce((acc, val) => tf.add(acc, val));
  return tf.sum(means); // unsure if tf.sum() is needed here
}
I pass it the feature-extraction model and a tensor4d, and it returns a tensor holding a single value.
The (partial) gradient-ascent function I'm using:
function gradient_ascent(model, img_tensor) {
  // Add a batch dimension before feeding the image to the model.
  const img_batch = img_tensor.expandDims(0);
  const loss_function = (input) => calc_loss(model, input);
  const grad_function = tf.grad(loss_function);
  return grad_function(img_batch);
}
The error it throws:
Uncaught (in promise) TypeError: x is undefined
clone http://127.0.0.1:8080/tf.2.8.2.js:17122
saved http://127.0.0.1:8080/tf.2.8.2.js:17373
saveTensorsForBackwardMode http://127.0.0.1:8080/tf.2.8.2.js:17372
kernelFunc http://127.0.0.1:8080/tf.2.8.2.js:17277
runKernelFunc http://127.0.0.1:8080/tf.2.8.2.js:17324
scopedRun http://127.0.0.1:8080/tf.2.8.2.js:17094
runKernelFunc http://127.0.0.1:8080/tf.2.8.2.js:17318
runKernel http://127.0.0.1:8080/tf.2.8.2.js:17171
batchnorm_ http://127.0.0.1:8080/tf.2.8.2.js:26574
f2 http://127.0.0.1:8080/tf.2.8.2.js:18338
batchnorm4d_ http://127.0.0.1:8080/tf.2.8.2.js:26746
f2 http://127.0.0.1:8080/tf.2.8.2.js:18338
batchnormalization http://127.0.0.1:8080/tf.2.8.2.js:72769
normalizeInference http://127.0.0.1:8080/tf.2.8.2.js:72966
call http://127.0.0.1:8080/tf.2.8.2.js:72971
tidy http://127.0.0.1:8080/tf.2.8.2.js:17080
scopedRun http://127.0.0.1:8080/tf.2.8.2.js:17094
tidy http://127.0.0.1:8080/tf.2.8.2.js:17075
tidy http://127.0.0.1:8080/tf.2.8.2.js:24132
call http://127.0.0.1:8080/tf.2.8.2.js:72942
apply http://127.0.0.1:8080/tf.2.8.2.js:56063
nameScope http://127.0.0.1:8080/tf.2.8.2.js:53015
apply http://127.0.0.1:8080/tf.2.8.2.js:56019
execute http://127.0.0.1:8080/tf.2.8.2.js:59585
batchOuts http://127.0.0.1:8080/tf.2.8.2.js:63644
tidy http://127.0.0.1:8080/tf.2.8.2.js:17080
scopedRun http://127.0.0.1:8080/tf.2.8.2.js:17094
tidy http://127.0.0.1:8080/tf.2.8.2.js:17075
tidy http://127.0.0.1:8080/tf.2.8.2.js:24132
_loop2 http://127.0.0.1:8080/tf.2.8.2.js:63620
predictLoop http://127.0.0.1:8080/tf.2.8.2.js:63652
tidy http://127.0.0.1:8080/tf.2.8.2.js:17080
scopedRun http://127.0.0.1:8080/tf.2.8.2.js:17094
tidy http://127.0.0.1:8080/tf.2.8.2.js:17075
tidy http://127.0.0.1:8080/tf.2.8.2.js:24132
predictLoop http://127.0.0.1:8080/tf.2.8.2.js:63601
predict http://127.0.0.1:8080/tf.2.8.2.js:63704
calc_loss http://127.0.0.1:8080/utils.js:103
loss_function http://127.0.0.1:8080/utils.js:124
gradients http://127.0.0.1:8080/tf.2.8.2.js:29870
tidy http://127.0.0.1:8080/tf.2.8.2.js:17080
scopedRun http://127.0.0.1:8080/tf.2.8.2.js:17094
tidy http://127.0.0.1:8080/tf.2.8.2.js:17075
y http://127.0.0.1:8080/tf.2.8.2.js:17798
scopedRun http://127.0.0.1:8080/tf.2.8.2.js:17094
gradients http://127.0.0.1:8080/tf.2.8.2.js:17793
grad http://127.0.0.1:8080/tf.2.8.2.js:29869
tidy http://127.0.0.1:8080/tf.2.8.2.js:17080
scopedRun http://127.0.0.1:8080/tf.2.8.2.js:17094
tidy http://127.0.0.1:8080/tf.2.8.2.js:17075
grad http://127.0.0.1:8080/tf.2.8.2.js:29868
gradient_ascent http://127.0.0.1:8080/utils.js:127
handleTest http://127.0.0.1:8080/script.js:74
promise callback*handleTest/< http://127.0.0.1:8080/script.js:69
promise callback*handleTest http://127.0.0.1:8080/script.js:68
EventListener.handleEvent* http://127.0.0.1:8080/script.js:126
What I've tried:
- I replaced the loss function and the gradient computation with inputGradientAscent() from this repo (tfjs-examples), but got a similar error: instead of x being undefined, it reported _this2.gamma is undefined.
- Stepping through with the debugger gave me no insight.
- I have tried tf.js versions 2.0, 2.4, 2.7, and now 2.8.2, all with the same result.