Problem description
I am trying to implement a custom gradient-descent update across multiple layers using tf.GradientTape. The TensorFlow tutorial only shows a regression example without multiple layers. Please help. The network has one hidden layer (L01). I want to know how to compute the gradient of the loss with respect to w02, and the gradient of the hidden-layer error (at L01) with respect to w01.
# ---------------------------------------------
# model development
w01 = tf.Variable(tf.random.uniform(shape=(input_dim,10)))
b01 = tf.Variable(tf.zeros(shape=(10,)))
w02 = tf.Variable(tf.random.uniform(shape=(10,output_dim)))
b02 = tf.Variable(tf.zeros(shape=(output_dim,)))
def compute_predictions(features):
    # return oHypothesis(features)
    L01 = tf.matmul(features, w01) + b01
    L01 = tf.nn.sigmoid(L01)
    L02 = tf.matmul(L01, w02) + b02
    L02 = tf.nn.sigmoid(L02)
    oHypothesis = L02
    return oHypothesis
def compute_diff01(features):
    # return oHypothesis(features)
    L01 = tf.matmul(features, w01) + b01
    L01 = tf.nn.sigmoid(L01)
    return L01
def compute_loss(y, predictions):
    return tf.reduce_mean(tf.square(y - predictions))
# ---------------------------------------------
# model training
def train_on_batch(x, y):
    # ??? question here. How to calculate weights ???
    with tf.GradientTape() as tape:
        predictions = compute_predictions(x)
        loss = tf.reduce_mean(tf.square(y - predictions))
    dloss_dw02, dloss_db02 = tape.gradient(loss, [w02, b02])
    diff01 = tf.reduce_mean(tf.square(y - predictions))
    dL01_dw01, dL01_db01 = tape.gradient(diff01, [w01, b01])
    w01.assign_sub(learning_rate * dloss_dw01)
    b01.assign_sub(learning_rate * dloss_db01)
    w02.assign_sub(learning_rate * dloss_dw02)
    b02.assign_sub(learning_rate * dloss_db02)
    return loss
for epoch in range(10):
    for step, (x, y) in enumerate(dataset):
        loss = train_on_batch(x, y)
Solution
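A minimal sketch of one way to do this (assuming `input_dim`, `output_dim`, and `learning_rate` values chosen for illustration): you do not need a separate gradient computation per layer. A single `tape.gradient(loss, [w01, b01, w02, b02])` call backpropagates through both layers automatically, so all four variables can be updated from one loss. Note the tape in the question is non-persistent, so calling `tape.gradient` twice on it would raise an error; if you really want the intermediate gradients such as dL01/dw01 separately, create the tape with `persistent=True`.

```python
import tensorflow as tf

# Assumed dimensions and learning rate for illustration.
input_dim, output_dim = 4, 1
learning_rate = 0.1

w01 = tf.Variable(tf.random.uniform(shape=(input_dim, 10)))
b01 = tf.Variable(tf.zeros(shape=(10,)))
w02 = tf.Variable(tf.random.uniform(shape=(10, output_dim)))
b02 = tf.Variable(tf.zeros(shape=(output_dim,)))

def compute_predictions(features):
    # Hidden layer L01, then output layer L02, both with sigmoid.
    L01 = tf.nn.sigmoid(tf.matmul(features, w01) + b01)
    return tf.nn.sigmoid(tf.matmul(L01, w02) + b02)

def train_on_batch(x, y):
    # The tape records the whole forward pass; one tape.gradient call
    # then backpropagates the loss through BOTH layers at once.
    with tf.GradientTape() as tape:
        predictions = compute_predictions(x)
        loss = tf.reduce_mean(tf.square(y - predictions))
    dw01, db01, dw02, db02 = tape.gradient(loss, [w01, b01, w02, b02])
    w01.assign_sub(learning_rate * dw01)
    b01.assign_sub(learning_rate * db01)
    w02.assign_sub(learning_rate * dw02)
    b02.assign_sub(learning_rate * db02)
    return loss
```

If you want to inspect per-layer gradients as well (e.g. the gradient of L01 with respect to w01), use `with tf.GradientTape(persistent=True) as tape:` and compute L01 inside the tape; you can then call `tape.gradient` multiple times on the same tape.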